    Evaluation of Functional Data Models for Database Design and Use

    The problems of design, operation, and maintenance of databases using the three most popular database management systems (Hierarchical, CODASYL/DBTG, and Relational) are well known. Users wishing to use these systems have to make conscious and often complex mappings between real-world structures and the data structuring options (data models) provided by these systems. In addition, much of the semantics associated with the data either does not get expressed at all or gets embedded procedurally in application programs in an ad-hoc way. In recent years, a large number of data models (called semantic data models) have been proposed with the aim of simplifying database design and use. However, the lack of usable implementations of these proposals has so far inhibited the widespread use of these concepts. The present work reports on an effort to evaluate and extend one such semantic model by means of an implementation. It is based on the functional data model proposed earlier by Shipman [SHIP81]. We call this the Extended Functional Data Model (EFDM). EFDM, like Shipman's proposals, is a marriage of three of the advanced modelling concepts found in both database and artificial intelligence research: the concept of an entity to represent an object in the real world, the concept of a type hierarchy among entity types, and the concept of derived data for modelling procedural knowledge. The functional notation of the model lends itself to high-level data manipulation languages. Data selection in these languages is expressed simply as function application. Further, the functional approach makes it possible to incorporate general-purpose computation facilities in the data languages without having to embed them in procedural languages. In addition to providing the usual database facilities, the implementation also provides a mechanism to specify multiple user views of the database.
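    To make the "data selection as function application" idea concrete, here is a minimal sketch in Python (not EFDM or Daplex syntax): entities are plain objects, attributes and relationships are functions over entities, a derived function captures procedural knowledge, and a query is ordinary function application and composition. The entity names and attributes are invented for illustration.

```python
# A minimal sketch (not EFDM/Daplex syntax): entities as objects, attributes and
# relationships as functions over entities, queries as plain function application.

class Entity:
    def __init__(self, **attrs):
        self.__dict__.update(attrs)

# Base entities (invented example data)
dept_cs = Entity(name="Computing Science")
alice = Entity(name="Alice", salary=42000, dept=dept_cs)
bob = Entity(name="Bob", salary=39000, dept=dept_cs)
employees = [alice, bob]

# Attribute and relationship functions, in the spirit of name(EMPLOYEE) -> STRING
def name(e): return e.name
def salary(e): return e.salary
def dept(e): return e.dept

# A derived function: procedural knowledge kept alongside the schema
def well_paid(e, threshold=40000):
    return salary(e) > threshold

# Data selection expressed as function application rather than an embedded
# procedural sublanguage
print([name(e) for e in employees
       if well_paid(e) and name(dept(e)) == "Computing Science"])
# -> ['Alice']
```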

    Exploring parallelism with object oriented database management system

    The object oriented approach to database management systems aims to remove the limitations of current systems by providing enhanced semantic capabilities and more flexible facilities, including the encapsulation of operations as well as data in the specification of an object. Such systems are certainly more complex than existing database management systems. Although they are complex, current object oriented database management systems are built for von Neumann (purely sequential) machines. Such implementations inevitably lead to major problems of efficiency and performance, so new implementation techniques need to be investigated. One possible solution to these efficiency and performance problems is to use parallel processing techniques. Thus, the aim of this research is to propose aspects in which parallel processing can be introduced within the scope of object oriented database management systems and to identify ways in which performance can be improved. A prototype of the main components of an object oriented database system called KBZ has been implemented to test some of these parallel processing aspects. The thesis starts with an introduction and background to the research. It then describes major parallel system architectures for an object oriented database management system. Techniques such as distributing a large volume of data among various processors (transputers), performing processing in the background of the system to reduce response time, and performing parallel input/output processing are presented. The initial prototype, PKBZ version-1, is then described; in particular, the logical and physical representation of object classes, how they communicate through message sending, and the different types of message supported. Two prototype versions exist. The initial prototype was designed to investigate the parallel implementation and general functionality of the system. The second version provides greater flexibility and incorporates enhanced functionality to allow experimentation. The enhancements in the second version are also discussed in the thesis, and the experimental results using different transputer configurations are illustrated and analyzed.
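    The following is a small, generic sketch of two of the techniques mentioned above: partitioning objects across processing elements and interacting with them purely by message passing. It uses Python's multiprocessing module rather than transputers and does not reflect the actual KBZ/PKBZ implementation; the Part class and the message format are invented for illustration.

```python
# A generic sketch (not PKBZ): hash-partition objects over worker processes
# standing in for transputers, and communicate only by message passing.

from multiprocessing import Process, Queue

def worker(inbox: Queue, outbox: Queue, objects: dict):
    """Each 'processing element' owns a partition of the objects and answers
    messages of the form (method_name, object_id)."""
    for method, oid in iter(inbox.get, None):          # None is the stop signal
        obj = objects[oid]
        outbox.put((oid, getattr(obj, method)()))

class Part:                                            # a toy object class
    def __init__(self, name, price): self.name, self.price = name, price
    def describe(self): return f"{self.name}: {self.price}"

if __name__ == "__main__":
    data = {i: Part(f"part-{i}", i * 10) for i in range(6)}
    inboxes, outbox = [Queue(), Queue()], Queue()
    # Worker k owns the objects whose id hashes to k (here: id modulo 2).
    procs = [Process(target=worker,
                     args=(inboxes[k], outbox,
                           {i: o for i, o in data.items() if i % 2 == k}))
             for k in range(2)]
    for p in procs: p.start()
    for i in data: inboxes[i % 2].put(("describe", i))  # send messages
    for _ in data: print(outbox.get())                  # gather replies
    for q in inboxes: q.put(None)                       # shut the workers down
    for p in procs: p.join()
```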

    The exploration of a category theory-based virtual Geometrical product specification system for design and manufacturing

    In order to ensure the quality of products and to facilitate global outsourcing, almost all the so-called “world-class” manufacturing companies nowadays apply various tools and methods to maintain the consistency of a product’s characteristics throughout its manufacturing life cycle. Among these, for ensuring the consistency of geometric characteristics, a tolerancing language, the Geometrical Product Specification (GPS), has been widely adopted to precisely transform functional requirements from customers into manufactured workpieces, expressed as tolerance notes in technical drawings. Although commonly acknowledged by industrial users as one of the most successful efforts in integrating existing manufacturing life-cycle standards, current GPS implementations and software packages suffer from several drawbacks in practical use, possibly the most significant being the difficulty of inferring the data for the “best” solutions. The problem stems from the underlying data structures and knowledge-based system design, which indicates the need for a “new” software system to facilitate GPS applications. This thesis introduces an innovative knowledge-based system, the VirtualGPS, that provides an integrated GPS knowledge platform based on a stable and efficient database structure with knowledge generation and access facilities. The system focuses on solving intrinsic product design and production problems by acting as a virtual domain expert, translating GPS standards and rules into computerized expert advice and warnings. Furthermore, the system can be used as a training tool that helps young and new engineers understand the large body of GPS standards relatively quickly. The thesis starts with a detailed discussion of the proposed categorical modelling mechanism, which has been devised based on Category Theory. It provides a unified mechanism for knowledge acquisition and representation, knowledge-based system design, and database schema modelling. As a core part of assessing this knowledge-based system, the implementation of the categorical Database Management System (DBMS) is also presented. The focus then moves on to the design and implementation of the proposed VirtualGPS system. The tests and evaluations of this system are presented in Chapter 6. Finally, the thesis summarizes the contributions to knowledge in Chapter 7. After thoroughly reviewing the project, the conclusion reached is that the entire VirtualGPS system was designed and implemented to conform to Category Theory and object-oriented programming rules. The initial tests and performance analyses show that the system facilitates geometric product manufacturing operations and benefits manufacturers and engineers alike, from functional design through manufacturing and verification.
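    A hypothetical sketch of the categorical modelling idea follows: schema elements are treated as the objects of a category, relationships between them as morphisms, and composition of morphisms yields derived relationships. The Feature/Tolerance/Verification names and the toy mappings are invented for illustration; this is not the VirtualGPS schema.

```python
# A toy illustration of categorical schema modelling: objects, morphisms, and
# composition of morphisms as derived relationships.

class Morphism:
    def __init__(self, name, source, target, fn):
        self.name, self.source, self.target, self.fn = name, source, target, fn

    def __call__(self, x):
        return self.fn(x)

def compose(g, f):
    """Categorical composition g . f, defined only when the types match."""
    assert f.target == g.source, "morphisms do not compose"
    return Morphism(f"{g.name}.{f.name}", f.source, g.target, lambda x: g(f(x)))

# Objects of the schema category (labelled here only by name)
FEATURE, TOLERANCE, VERIFICATION = "Feature", "Tolerance", "Verification"

# Morphisms: a design feature carries a tolerance; a tolerance determines a
# verification procedure (toy mappings, for illustration only).
has_tolerance = Morphism("has_tolerance", FEATURE, TOLERANCE,
                         lambda feature: {"feature": feature, "spec": "±0.05 mm"})
verified_by = Morphism("verified_by", TOLERANCE, VERIFICATION,
                       lambda tol: f"measure {tol['feature']} to {tol['spec']}")

# The composite morphism answers the derived question "how is this feature verified?"
how_verified = compose(verified_by, has_tolerance)
print(how_verified("cylinder bore"))   # -> measure cylinder bore to ±0.05 mm
```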

    On the Utilisation of Persistent Programming Environments

    There is a growing gap between the supply and demand of good quality software, which is primarily due to the difficulty of the programming task and the poor level of support for programmers. Programming is carried out using software tools which do not match very well either real-world understanding of a problem or even the other tools which need to be used. In every phase of software production, the programmer must master new tools which function differently from one another. The Persistent Programming Paradigm attempts to reduce these problems by providing a programming environment which gives consistent methods of accessing program values of various kinds. Long-term and short-term data are treated in the same way. Numbers, text, graphical values and even program objects are all referred to in the same consistent way. Languages which support persistence provide considerable power within a simple environment, so that programmers can perform most if not all parts of the programming task in a coherent and uniform manner. This thesis tests the hypothesis that programmers do in fact derive some benefit from this: simpler programs and faster implementation of complex programs. The persistent language PS-algol is introduced and used to build: user-interface and compiler tools; a database application; some data modelling tools, both relational and semantic; a rapid prototyping system; an object-oriented language; and software support systems. In doing so, the thesis demonstrates the breadth of work which can be achieved using a Persistent Programming Language, and the ease with which these various projects can be implemented. Further, the thesis derives the beginnings of a methodology for using such a language and analyses how PS-algol could be improved. In this way, the work aims to put the Persistent Programming Paradigm on a firm basis following significant use and experimentation.
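    A rough Python analogue of the persistence idea follows (PS-algol itself is not shown): long-term values are read and updated through the same dictionary-like interface as ordinary in-memory data, so the program never switches to a separate file or database sublanguage. The store name and the values kept in it are invented for illustration.

```python
# A rough analogue of persistence (not PS-algol): long-term values are accessed
# through the same interface used for ordinary in-memory data.

import shelve

def run():
    # 'db' behaves like a dictionary, but its contents survive between runs
    # (kept in the invented on-disk store "demo_store").
    with shelve.open("demo_store") as db:
        visits = db.get("visits", 0) + 1          # read a persistent value
        db["visits"] = visits                     # update it like transient data
        log = db.get("log", [])
        log.append(f"run number {visits}")
        db["log"] = log                           # write the structured value back
        print(visits, log[-3:])

if __name__ == "__main__":
    run()
```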

    A parallel process model and architecture for a Pure Logic Language

    The research presented in this thesis has been concerned with the use of parallel logic systems for the implementation of large knowledge bases. The thesis describes proposals for a parallel logic system based on a new logic programming language, the Pure Logic Language. The work has involved the definition and implementation of a new logic interpreter which incorporates the parallel execution of independent OR processes, and the specification and design of an appropriate non-shared-memory multiprocessor architecture. The Pure Logic Language, which is under development at ICL, Bracknell, differs from Prolog in its expressive power and implementation. The resolution-based Prolog approach is replaced by a rewrite rule technique which successively transforms expressions according to logical axioms and user-defined rules until no further rewrites are possible. A review of related work in the field of parallel logic language systems is presented. The thesis describes the different forms of parallelism within logic languages and discusses the decision to concentrate on the efficient implementation of OR parallelism. The parallel process model for the Pure Logic Language uses the same execution technique of rule rewriting but has been adapted to implement the creation of independent OR processes and the required message passing operations. The parallelism in the system is implemented automatically and, unlike many other parallel logic systems, there are no explicit program annotations for the control of parallel execution. The spawning of processes involves computational overheads within the interpreter: these have been measured and results are presented. The functional requirements of a multiprocessor architecture are discussed: shared memory machines are not scalable for large numbers of processing elements, but, with no shared memory, data needed by offspring processors must be copied from the parent or else recomputed. The thesis describes an optimised format for the copying of data between processors. Because a one-to-many communication pattern exists between parent and offspring processors, a broadcast architecture is indicated. The development of a system based on the broadcasting of data packets represents a new approach to the parallel execution of logic languages and has led to the design of a novel bus-based multiprocessor architecture. A simulation of this multiprocessor architecture has been produced and the parallel logic interpreter mapped onto it: this provides data on the predicted performance of the system. A detailed analysis of these results is presented and the implications for future developments of the proposed system are discussed.
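    The sketch below illustrates, in Python rather than the thesis's interpreter, the two mechanisms described above: evaluation by repeatedly applying rewrite rules until no further rewrite is possible, and OR-parallelism in which alternative candidate terms are reduced as independent processes. The toy arithmetic rules and terms are invented for illustration.

```python
# A toy sketch (not the Pure Logic Language interpreter): rewrite-rule
# evaluation, with independent alternatives reduced in parallel processes.

from concurrent.futures import ProcessPoolExecutor

# Each rule rewrites a tuple term (op, left, right) into a simpler term, or
# returns None if it does not apply. The rules here are toy arithmetic axioms.
def rule_add_zero(t):   # x + 0  ->  x
    return t[1] if isinstance(t, tuple) and t[0] == "+" and t[2] == 0 else None

def rule_mul_zero(t):   # x * 0  ->  0
    return 0 if isinstance(t, tuple) and t[0] == "*" and t[2] == 0 else None

def rule_mul_one(t):    # x * 1  ->  x
    return t[1] if isinstance(t, tuple) and t[0] == "*" and t[2] == 1 else None

RULES = [rule_add_zero, rule_mul_zero, rule_mul_one]

def rewrite(term):
    """Apply the rules repeatedly until no further rewrite is possible."""
    changed = True
    while changed:
        changed = False
        for rule in RULES:
            new = rule(term)
            if new is not None:
                term, changed = new, True
                break
    return term

if __name__ == "__main__":
    # Independent OR branches: each candidate term is reduced in its own process.
    branches = [("+", ("*", 7, 1), 0), ("*", ("+", 5, 0), 0), ("*", 9, 1)]
    with ProcessPoolExecutor() as pool:
        print(list(pool.map(rewrite, branches)))   # -> [7, 0, 9]
```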

    An Introduction to Database Systems

    This textbook introduces the basic concepts of database systems. These concepts are presented through numerous examples in modeling and design. The material in this book is geared to an introductory course in database systems offered at the junior or senior level of Computer Science. It could also be used in a first-year graduate course in database systems, focusing on a selection of the advanced topics in the later chapters.

    The 1990 Goddard Conference on Space Applications of Artificial Intelligence

    The papers presented at the 1990 Goddard Conference on Space Applications of Artificial Intelligence are given. The purpose of this annual conference is to provide a forum in which current research and development directed at space applications of artificial intelligence can be presented and discussed. The proceedings fall into the following areas: Planning and Scheduling, Fault Monitoring/Diagnosis, Image Processing and Machine Vision, Robotics/Intelligent Control, Development Methodologies, Information Management, and Knowledge Acquisition.