2,654 research outputs found

    Steps Towards a Method for the Formal Modeling of Dynamic Objects

    Fragments of a method to formally specify object-oriented models of a universe of discourse are presented. The task of finding such models is divided into three subtasks: object classification, event specification, and the specification of the life cycle of an object. Each of these subtasks is further subdivided, and for each subtask heuristics are given that can aid the analyst in deciding how to represent a particular aspect of the real world. The main sources of inspiration are Jackson System Development, the algebraic specification of data and object types, and the algebraic specification of processes.
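    As a rough illustration of the life-cycle subtask, the sketch below encodes a JSD-style object life cycle as a small state machine over events. The LibraryBook class, its events, and its states are hypothetical examples chosen for illustration, not taken from the paper.

```python
# Minimal sketch: a JSD-style object life cycle expressed as a finite
# transition structure over events. LibraryBook and its events are
# hypothetical, not from the paper.

class LifeCycleError(Exception):
    pass

class LibraryBook:
    # life cycle: acquire ; (lend ; return)* ; discard
    _transitions = {
        ("initial", "acquire"): "on_shelf",
        ("on_shelf", "lend"): "on_loan",
        ("on_loan", "return"): "on_shelf",
        ("on_shelf", "discard"): "final",
    }

    def __init__(self):
        self.state = "initial"

    def handle(self, event: str) -> None:
        """Accept an event only if the life cycle permits it in the current state."""
        key = (self.state, event)
        if key not in self._transitions:
            raise LifeCycleError(f"event {event!r} not allowed in state {self.state!r}")
        self.state = self._transitions[key]

book = LibraryBook()
for e in ["acquire", "lend", "return", "discard"]:
    book.handle(e)   # follows the permitted life cycle
```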

    Query Evaluation in Recursive Databases


    New Architectural Models for Visibly Controllable Computing: The Relevance of Dynamic Object Oriented Architectures and Plan Based Computing Models

    Traditionally, we've focused on the question of how to make a system easy to code the first time, or perhaps on how to ease the system's continued evolution. But if we look at life cycle costs, then we must conclude that the important question is how to make a system easy to operate. To do this we need to make it easy for the operators to see what's going on and then manipulate the system so that it does what it is supposed to do. This is a radically different criterion for success. What makes a computer system visible and controllable? This is a difficult question, but it's clear that today's mainstream operating systems, with nearly 50 million source lines of code, are neither. Strikingly, the MIT Lisp Machine and its commercial successors provided almost the same functionality as today's mainstream systems, but with only 1 million lines of code. This paper is a retrospective examination of the features of the Lisp Machine hardware and software system. Our key claim is that by building the Object Abstraction into the lowest tiers of the system, great synergy and clarity were obtained. It is our hope that this lesson can inform tomorrow's designs. We also speculate on how the spirit of the Lisp Machine could be extended to include a comprehensive access control model and how new layers of abstraction could further enrich this model.

    Query processing in temporal object-oriented databases

    This PhD thesis is concerned with historical data management in the context of object-oriented databases. An extensible approach has been explored to processing temporal object queries within a uniform query framework. By the uniform framework, we mean that temporal queries can be processed within the existing object-oriented framework, itself extended from the relational framework, by extending the query processing techniques and strategies developed for OODBs and RDBs. The unified model of OODBs and RDBs in UniSQL/X has been adopted as a basis for this purpose. A temporal object data model is thereby defined by incorporating a time dimension into this unified model of OODBs and RDBs, forming temporal relational-like cubes but with the addition of aggregation and inheritance hierarchies. A query algebra that accesses objects through these associations of aggregation, inheritance and time-reference is then defined as a general query model/language. Owing to the extensive features of our data model and the reducibility of the algebra, a layered query processor is presented that provides a uniform framework for processing temporal object queries. Within the uniform framework, query transformation is carried out based on a set of identified transformation rules that includes the known relational and object rules plus those pertaining to the time dimension. To evaluate a temporal query involving a path with time-reference, a strategy of decomposition is proposed. That is, evaluation of an enhanced path, which is defined to extend a path with time-reference, is decomposed by initially dividing the path into two sub-paths: one containing the time-stamped class, which can be optimized by making use of the ordering information of temporal data, and another an ordinary sub-path (without time-stamped classes), which can be further decomposed and evaluated using different algorithms. The intermediate results of traversing the two sub-paths are then joined together to create the query output. Algorithms for processing the decomposed query components, i.e., time-related operation algorithms and four join algorithms (nested-loop forward join, sort-merge forward join, nested-loop reverse join and sort-merge reverse join) and their modifications, have been presented with cost analysis and implemented with stream processing techniques using C++. Simulation results are also provided. Both the cost analysis and the simulation show the effects of time on the query processing algorithms: the join time cost increases linearly with the number of time-epochs (the time dimension in the case of a regular TS). It is also shown that using heuristics that make use of time information can lead to significant savings in time cost. Query processing with incomplete temporal data has also been discussed.
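    To make the decomposition strategy concrete, here is a minimal sketch of the final step, joining the time-stamped sub-path result with the ordinary sub-path result, written as a plain sort-merge join on object identifiers. The tuple layout, the sample data, and the function name are assumptions for illustration; the thesis's four join variants and their time-based optimizations are not reproduced here.

```python
# Hedged sketch of joining the two sub-path results on object identifier.
# Input layout (oid, payload) and the sample data are hypothetical.

def sort_merge_join(left, right):
    """Join two lists of (oid, payload) tuples on oid; both inputs are sorted by oid."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i][0] < right[j][0]:
            i += 1
        elif left[i][0] > right[j][0]:
            j += 1
        else:
            oid = left[i][0]
            # collect all matching tuples for this oid on both sides
            li = i
            while i < len(left) and left[i][0] == oid:
                i += 1
            rj = j
            while j < len(right) and right[j][0] == oid:
                j += 1
            for l in left[li:i]:
                for r in right[rj:j]:
                    out.append((oid, l[1], r[1]))
    return out

# time-stamped sub-path result: (oid, (value, [t_start, t_end)))
temporal = [(1, ("salary=50k", (0, 5))), (1, ("salary=55k", (5, 9))), (3, ("salary=60k", (0, 9)))]
# ordinary sub-path result: (oid, attribute value)
ordinary = [(1, "dept=CS"), (2, "dept=EE"), (3, "dept=CS")]
print(sort_merge_join(temporal, ordinary))
```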

    Workshop on Database Programming Languages

    These are the revised proceedings of the Workshop on Database Programming Languages held at Roscoff, Finistère, France, in September 1987. The last few years have seen enormous activity in the development of new programming languages and new programming environments for databases. The purpose of the workshop was to bring together researchers from both databases and programming languages to discuss recent developments in the two areas, in the hope of overcoming some of the obstacles that appear to prevent the construction of a uniform database programming environment. The workshop, which follows a previous workshop held in Appin, Scotland in 1985, was extremely successful. The organizers were delighted with both the quality and the volume of the submissions for this meeting, and it was regrettable that more papers could not be accepted. Both the stimulating discussions and the excellent food and scenery of the Brittany coast made the meeting thoroughly enjoyable. There were three main foci for this workshop: type systems suitable for databases (especially object-oriented and complex-object databases), the representation and manipulation of persistent structures, and extensions to deductive databases that allow for more general and flexible programming. Many of the papers describe recent results, or work in progress, and are indicative of the latest research trends in database programming languages. The organizers are extremely grateful for the financial support given by CRAI (Italy), Altaïr (France) and AT&T (USA). We would also like to acknowledge the organizational help provided by Florence Deshors, Hélène Gans and Pauline Turcaud of Altaïr, and by Karen Carter of the University of Pennsylvania.

    Semantic networks

    A semantic network is a graph of the structure of meaning. This article introduces semantic network systems and their importance in Artificial Intelligence, followed by I. the early background; II. a summary of the basic ideas and issues, including link types, frame systems, case relations, link valence, abstraction, inheritance hierarchies and logic extensions; and III. a survey of ‘world-structuring’ systems, including ontologies, causal link models, continuous models, relevance, formal dictionaries, semantic primitives and intersecting inference hierarchies. Speed and practical implementation are briefly discussed. The conclusion argues for a synthesis of relational graph theory, graph-grammar theory and order theory based on semantic primitives and multiple intersecting inference hierarchies.
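    A toy rendition of the basic machinery the survey discusses (typed links and property inheritance up an is-a hierarchy) is sketched below; the node names, link types, and lookup strategy are illustrative choices, not a reconstruction of any particular system covered in the article.

```python
# Minimal sketch of a semantic network with typed links and property
# inheritance along "isa" links. Names and strategy are illustrative only.

from collections import defaultdict

class SemanticNet:
    def __init__(self):
        self.links = defaultdict(list)   # node -> [(link_type, target)]
        self.props = defaultdict(dict)   # node -> {property: value}

    def add_link(self, source, link_type, target):
        self.links[source].append((link_type, target))

    def set_prop(self, node, prop, value):
        self.props[node][prop] = value

    def lookup(self, node, prop):
        """Return a property value, searching up the isa hierarchy if needed."""
        if prop in self.props[node]:
            return self.props[node][prop]
        for link_type, parent in self.links[node]:
            if link_type == "isa":
                value = self.lookup(parent, prop)
                if value is not None:
                    return value
        return None

net = SemanticNet()
net.add_link("canary", "isa", "bird")
net.add_link("bird", "isa", "animal")
net.set_prop("bird", "locomotion", "flies")
net.set_prop("canary", "colour", "yellow")
print(net.lookup("canary", "locomotion"))   # inherited from "bird": flies
```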

    Inductive Pattern Formation

    With the extended computational limits of algorithmic recursion, scientific investigation is transitioning away from computationally decidable problems and beginning to address computationally undecidable complexity. The analysis of deductive inference in structure-property models is yielding to the synthesis of inductive inference in process-structure simulations. Process-structure modeling has examined the external order parameters of inductive pattern formation, but investigation of the internal order parameters of self-organization has been hampered by the lack of a mathematical formalism able to quantitatively define a specific configuration of points. This investigation addressed that issue of quantitative synthesis. Local space was developed by the Poincaré inflation of a set of points to construct neighborhood intersections, defining topological distance and introducing situated Boolean topology as a local replacement for point-set topology. Parallel development of the local semi-metric topological space, the local semi-metric probability space, and the local metric space of a set of points provides a triangulation of connectivity measures that defines the quantitative architectural identity of a configuration and structure-independent axes of a structural configuration space. The recursive sequence of intersections constructs a probabilistic discrete spacetime model of interacting fields to define the internal order parameters of self-organization, with order parameters external to the configuration modeled by adjusting the morphological parameters of individual neighborhoods and the interplay of excitatory and inhibitory point sets. The evolutionary trajectory of a configuration maps the development of specific hierarchical structure emergent from a specific set of initial conditions, with nested boundaries signaling the nonlinear properties of local causative configurations. This exploration of architectural configuration space concluded with initial process-structure-property models of deductive and inductive inference spaces. In the computationally undecidable problem of human niche construction, an adaptive-inductive pattern formation model with predictive control organized the bipartite recursion between an information structure and its physical expression as hierarchical ensembles of artificial neural network-like structures. The union of architectural identity and bipartite recursion generates a predictive structural model of an evolutionary design process, offering an alternative to the limitations of cognitive descriptive modeling. The low computational complexity of these models enables them to be embedded in physical constructions to create the artificial life forms of a real-time autonomously adaptive human habitat.
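    One plausible reading of the neighborhood-intersection construction is sketched below: each point is inflated to a ball of radius r, points whose balls intersect are connected, and topological distance is read as hop count in the resulting graph. This is only an interpretation offered for illustration, not the dissertation's formalism.

```python
# Hedged sketch: inflate points to balls of radius r, connect points whose
# balls intersect, and take hop count as a stand-in for topological distance.

from collections import deque
from math import dist

def intersection_graph(points, r):
    """Adjacency list: balls of radius r around p and q intersect iff |p - q| <= 2r."""
    adj = {i: [] for i in range(len(points))}
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if dist(points[i], points[j]) <= 2 * r:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def topological_distance(adj, src, dst):
    """Hop count between two points in the intersection graph (BFS)."""
    seen, frontier = {src: 0}, deque([src])
    while frontier:
        node = frontier.popleft()
        if node == dst:
            return seen[node]
        for nxt in adj[node]:
            if nxt not in seen:
                seen[nxt] = seen[node] + 1
                frontier.append(nxt)
    return None  # not connected at this inflation radius

pts = [(0, 0), (1, 0), (2, 0), (5, 5)]
adj = intersection_graph(pts, 0.6)
print(topological_distance(adj, 0, 2))   # 2 hops
print(topological_distance(adj, 0, 3))   # None: (5, 5) stays disconnected
```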

    HOOD : a Higher-Order Object-Oriented Database model and its implementation

    There is no accepted standard for the object-oriented database paradigm at present, which has led to differing definitions of features and conformance requirements. HOOD is a Higher-Order Object-Oriented Database system that defines a meta-data model for specifying the requirements of an object-oriented database, providing uniformity and extensibility. From this specification, and by making use of a comprehensive structure system, an exemplar or implementation model is defined. Among the constructs provided by the model are types, instances, objects, values, methods, base types, generic types and metatypes. The mechanisms of instantiation and subtyping allow for relationships between these constructs. Extensibility is provided in the model for types, base types, structures and methods. Uniformity is achieved by defining all constructs as instances and by using messages for all operations. There is only one form of object construct, which provides persistence and identities. The complex values and extensibility of the model allow it to adapt in order to model the real world, instead of adapting the real world to fit the model. We have implemented a subset of the structures and values defined in the model, provided persistence and identities for objects, and included the various constructs mentioned above. The method language allows for the specification of methods, the passing of messages, and the use of complex values. The compiler performs type checking and resolution and generates instructions for an abstract machine which manipulates the database.
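    A minimal sketch of the "everything is an instance, every operation is a message" idea follows; the Obj and send names, the slot layout, and the dispatch scheme are invented for illustration and are not HOOD's actual constructs.

```python
# Hedged sketch: types are themselves objects (instances of a metatype),
# and a single send() routine dispatches all operations as messages.

class Obj:
    def __init__(self, type_obj, slots=None):
        self.type = type_obj          # every construct is an instance of some type object
        self.slots = dict(slots or {})

def send(receiver, message, *args):
    """Dispatch a message by searching the receiver's type chain for a method."""
    t = receiver.type
    while t is not None:
        method = t.slots.get("methods", {}).get(message)
        if method is not None:
            return method(receiver, *args)
        t = t.slots.get("supertype")  # subtyping: fall back to the supertype
    raise AttributeError(f"no method {message!r}")

# metatype -> type -> instance chain
metatype = Obj(None, {"methods": {}})
person = Obj(metatype, {
    "supertype": None,
    "methods": {"greet": lambda self: f"hello, {self.slots['name']}"},
})
student = Obj(metatype, {"supertype": person, "methods": {}})

alice = Obj(student, {"name": "alice"})
print(send(alice, "greet"))           # method found via the supertype chain
```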