Concurrent Lexicalized Dependency Parsing: A Behavioral View on ParseTalk Events
The behavioral specification of an object-oriented grammar model is
considered. The model is based on full lexicalization, head-orientation via
valency constraints and dependency relations, inheritance as a means for
non-redundant lexicon specification, and concurrency of computation. The
computation model relies upon the actor paradigm, with concurrency entering
through asynchronous message passing between actors. In particular, we here
elaborate on principles of how the global behavior of a lexically distributed
grammar and its corresponding parser can be specified in terms of event type
networks and event networks, respectively.
Comment: 68 kB, 5 pages, PostScript
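The actor-style computation model described above can be sketched in miniature. The class and message names below are illustrative assumptions, not the ParseTalk API: each word actor owns a mailbox, receives asynchronous messages offering dependents, and accepts only those that fill an open valency slot.

```python
import queue
import threading

class WordActor(threading.Thread):
    """Toy lexical actor: a word with valency slots and an asynchronous mailbox."""
    def __init__(self, word, valency):
        super().__init__(daemon=True)
        self.word = word
        self.valency = set(valency)      # open slots, e.g. {"subj", "obj"}
        self.dependents = {}
        self.mailbox = queue.Queue()

    def send(self, msg):
        self.mailbox.put(msg)            # asynchronous message passing

    def run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:              # shutdown signal
                break
            role, dependent = msg
            if role in self.valency:     # valency constraint check
                self.valency.remove(role)
                self.dependents[role] = dependent

verb = WordActor("eats", valency=["subj", "obj"])
verb.start()
verb.send(("subj", "cat"))
verb.send(("obj", "fish"))
verb.send(None)
verb.join()
print(verb.dependents)   # {'subj': 'cat', 'obj': 'fish'}
```

In the real system the actors themselves propose attachments to one another; here a single actor merely consumes offers, which is enough to show the mailbox discipline.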
A spatiotemporal object-oriented data model for landslides (LOOM)
LOOM (landslide object-oriented model) is presented here as a data structure for landslide inventories based on the object-oriented paradigm. It aims at the effective storage, in a single dataset, of the complex spatial and temporal relations between landslides recorded and mapped in an area, and at their manipulation. Spatial relations are handled through a hierarchical classification based on topological rules, and two levels of aggregation are defined: (i) landslide complexes, grouping spatially connected landslides of the same type, and (ii) landslide systems, merging landslides of any type sharing a spatial connection. For the aggregation procedure, a minimal functional interaction between landslide objects has been defined as a spatial overlap between objects. Temporal characterization of landslides is achieved by assigning to each object an exact date or a time range for its occurrence, integrating both the time-frame and the event-based approaches. The combination of spatial integrity and temporal characterization ensures the storage of vertical relations between landslides, so that the superimposition of events can be easily retrieved by querying the temporal dataset. The methodology proposed here for landslide inventorying has been tested on selected case studies in the Cilento UNESCO Global Geopark (Italy). We demonstrate that the proposed LOOM model avoids data fragmentation, redundancy, and topological inconsistency between the digital data and the real-world features. This application proved powerful for reconstructing the gravity-induced deformation history of hillslopes, and thus for predicting their evolution.
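The overlap-based aggregation into complexes and systems can be illustrated with a small sketch. The class names, bounding-box geometry and union-find grouping below are simplifying assumptions, not the published LOOM schema:

```python
from itertools import combinations

class Landslide:
    """Toy landslide object: id, type, and an axis-aligned bounding box."""
    def __init__(self, lid, ltype, bbox):
        self.lid, self.ltype, self.bbox = lid, ltype, bbox  # bbox = (xmin, ymin, xmax, ymax)

def overlaps(a, b):
    """Minimal functional interaction: spatial overlap of bounding boxes."""
    ax0, ay0, ax1, ay1 = a.bbox
    bx0, by0, bx1, by1 = b.bbox
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

def aggregate(landslides, same_type):
    """Union-find grouping: same_type=True -> complexes, False -> systems."""
    parent = {l.lid: l.lid for l in landslides}
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for a, b in combinations(landslides, 2):
        if overlaps(a, b) and (not same_type or a.ltype == b.ltype):
            parent[find(a.lid)] = find(b.lid)
    groups = {}
    for l in landslides:
        groups.setdefault(find(l.lid), []).append(l.lid)
    return sorted(sorted(g) for g in groups.values())

slides = [Landslide(1, "flow", (0, 0, 2, 2)),
          Landslide(2, "flow", (1, 1, 3, 3)),
          Landslide(3, "slide", (2.5, 2.5, 4, 4))]
complexes = aggregate(slides, same_type=True)    # [[1, 2], [3]]
systems = aggregate(slides, same_type=False)     # [[1, 2, 3]]
```

Real inventories would use polygon geometries and topological rules rather than bounding boxes, but the two aggregation levels compose in exactly this way: every complex is contained in some system.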
Integrating Refinement into Software Development Tools
Abstract: Providing automatic tool support for formal design by refinement transformations is a challenge. In this paper, we bring this matter to the attention of the research community and discuss a component-based, model-transformational approach for integrating refinement into software development tools. Models, and their consistency and correctness, in an object-oriented and component-based development process are defined in rCOS, a refinement calculus recently developed at UNU-IIST. Correctness-preserving transformations between models are formalized and proved as refinement rules in rCOS. We then discuss how these transformations can be implemented in the relations language of Query/View/Transformation (QVT) standardized by the OMG.
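As an illustration only (not rCOS notation or QVT syntax), a correctness-preserving transformation can be pictured as a function from a source model to a refined target model. The toy "encapsulate field" rule below, on a dictionary-based class model, is an assumed example:

```python
# Illustrative sketch: a refinement rule as a model-to-model function.
# The model representation and the rule itself are assumptions for exposition.

def encapsulate_field(class_model, field):
    """Refine a class model: replace a public field with a private one
    plus get/set operations, leaving the source model untouched."""
    return {
        "name": class_model["name"],
        "fields": [f if f != field else "_" + f for f in class_model["fields"]],
        "operations": class_model["operations"]
                      + ["get_" + field, "set_" + field],
    }

account = {"name": "Account", "fields": ["balance"], "operations": ["deposit"]}
refined = encapsulate_field(account, "balance")
# refined["fields"] == ["_balance"]; operations gain get_balance / set_balance
```

In a QVT relations implementation, the same rule would be stated declaratively as a relation between source and target model elements, with the engine deriving the transformation direction.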
Object-oriented implementation of Prolog
Logic programming is a discipline of describing problems in high-level abstraction
by separating logic from control. Conventional Prolog interpretation or compilation
models take a procedural view of Prolog programs. A description of interpretation
models was summarized by Bruynooghe[Bru82] and a well-known compilation model
was introduced by Warren[War83].
The goal of this study is to present an alternative approach to constructing a Prolog
execution model that tackles the complexities caused by conventional Prolog execution
models. By taking advantage of object-oriented techniques, a new model, the
object-oriented model, is proposed. Instead of decomposing a given Prolog program into a
set of procedures, the model translates it into a collection of coordinated objects
which simulate components of the problem to be solved.
First, the object-oriented model is described in terms of the object base and
inference engine. The object base represents the components of Prolog programs
naturally with corresponding objects in terms of an AND/OR network. The inference
engine, which specifies the operational behaviour of the objects, is embedded in the
object base and independent of any specific Prolog program.
Second, implementation issues of a Prolog system based on the object-oriented
model are presented. A transformation program is developed to translate any given
Prolog program into a set of objects and assign the corresponding relations among
them. The implementation of the inference engine adopts Robinson’s resolution
[Rob79], which consists of two major algorithms: unification and backtracking.
Finally, a first-parameter hashing optimization and a uniform interface for adopting
new built-in predicates are addressed to show the extensibility of the proposed Prolog
system.
An experimental object-oriented Prolog system, LU-Prolog, has been developed
based on the proposed model. An evaluation of the performance of LU-Prolog and
its future directions are also presented in this thesis.
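Robinson-style unification, the first of the two algorithms mentioned above, can be sketched compactly. Terms are modelled here as tuples and variables as capitalized strings, which is an illustrative convention rather than LU-Prolog's actual representation (backtracking and the occurs check are omitted):

```python
def walk(term, subst):
    """Follow variable bindings until an unbound variable or non-variable term."""
    while isinstance(term, str) and term[0].isupper() and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst=None):
    """Return an extended substitution unifying a and b, or None on clash."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a[0].isupper():   # a is an unbound variable
        subst[a] = b
        return subst
    if isinstance(b, str) and b[0].isupper():   # b is an unbound variable
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):                  # unify functor and arguments
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None                                 # clash: unification fails

# parent(X, bob) unified with parent(alice, Y)
s = unify(("parent", "X", "bob"), ("parent", "alice", "Y"))
print(s)   # {'X': 'alice', 'Y': 'bob'}
```

In the object-oriented model, this logic would live inside the inference engine embedded in the object base, with each AND/OR node delegating unification to its term objects.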
Distance functional dependencies in the presence of complex values
Distance functional dependencies (dFDs) have been introduced in the context of the relational data model as a generalisation of error-robust functional dependencies (erFDs). An erFD is a dependency that still holds if errors are introduced into a relation which cause the violation of an original functional dependency. A dFD with a distance d=2e+1 corresponds to an erFD with at most e errors in each tuple. Recently, an axiomatisation of dFDs has been obtained. Database theory, however, no longer deals only with flat relations. Modern data models such as the higher-order Entity-Relationship model (HERM), object-oriented data models (OODM), or the eXtensible Markup Language (XML) provide constructors for complex values such as finite sets, multisets and lists. In this article, dFDs with complex values are investigated. Based on a generalisation of the Hamming distance for tuples to complex values, which exploits a lattice structure on subattributes, the major achievement is a finite axiomatisation of the new class of dependencies.
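The Hamming distance on flat tuples, the starting point of the generalisation above, counts the attribute positions on which two tuples differ. The extension to set-valued entries shown below (symmetric-difference size) is one natural illustration, not the paper's lattice-based definition:

```python
def hamming(t1, t2):
    """Number of attribute positions on which two flat tuples differ."""
    assert len(t1) == len(t2)
    return sum(1 for a, b in zip(t1, t2) if a != b)

def set_distance(s1, s2):
    """Illustrative distance for set-valued entries: |symmetric difference|."""
    return len(s1 ^ s2)

t1 = ("alice", 30, "paris")
t2 = ("alice", 31, "rome")
print(hamming(t1, t2))                     # 2
print(set_distance({1, 2, 3}, {2, 3, 4}))  # 2
```

Under the erFD correspondence, two tuples at Hamming distance 2 could arise from a single error in each tuple, which is why a dFD with d = 2e+1 tolerates up to e errors per tuple.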
Object-Oriented Dynamics Learning through Multi-Level Abstraction
Object-based approaches for learning action-conditioned dynamics have
demonstrated promise for generalization and interpretability. However, existing
approaches suffer from structural limitations and optimization difficulties for
common environments with multiple dynamic objects. In this paper, we present a
novel self-supervised learning framework, called Multi-level Abstraction
Object-oriented Predictor (MAOP), which employs a three-level learning
architecture that enables efficient object-based dynamics learning from raw
visual observations. We also design a spatial-temporal relational reasoning
mechanism for MAOP to support instance-level dynamics learning and handle
partial observability. Our results show that MAOP significantly outperforms
previous methods in terms of sample efficiency and generalization over novel
environments for learning environment models. We also demonstrate that learned
dynamics models enable efficient planning in unseen environments, comparable to
true environment models. In addition, MAOP learns semantically and visually
interpretable disentangled representations.
Comment: Accepted to the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI), 202
The role of functional prototyping within the KADS methodology : a thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Computer Science at Massey University
Knowledge-based systems have until recent times lacked a clear and complete methodology for their construction. KADS was the result of an early-1980s project (ESPRIT-I P1098) which had the aim of developing a comprehensive, commercially viable methodology for knowledge-based system construction. KADS has subsequently proved to be one of the more popular approaches, focusing on the modelling approach to knowledge-based system development. One area of the KADS methodology that has not been examined in any great depth is that of model validation. Model validation is the process of ensuring that a derived model is an accurate representation of the domain from which it has been derived. The two approaches which have been suggested for this purpose within the KADS framework are protocol analysis and functional prototyping. This project seeks to apply the second of these, functional prototyping, to the model of expertise created by da Silva (1994) for model validation purposes. The problem domain is that of farm management, under a joint program of research between the Computer Science, Information Systems and Agricultural Management departments of Massey University. The project took the model of expertise and created a knowledge representation model in compliance with the selected object-oriented paradigm. After this, a functional prototype was created in a Microsoft Windows based PC environment, using Kappa-PC as the application development tool. Validation took place through a demonstration session with a number of domain experts. Conclusions drawn from the experience gained through the creation and use of the prototype are presented, outlining the reasons why functional prototyping was deemed to be an appropriate method for model validation.
Implementing imperfect information in fuzzy databases
Information in real-world applications is often
vague, imprecise and uncertain. Ignoring the inherently imperfect
nature of the real world inevitably distorts the human perception of it and may discard
substantial information that is very useful in several
data-intensive applications. In the database context, several fuzzy
database models have been proposed. In these works, fuzziness
is introduced at different levels. Common to all these proposals is
the support of fuzziness at the attribute level. This paper first proposes
a rich set of data types devoted to modelling the different kinds
of imperfect information. The paper then proposes a formal
approach to implementing these data types. The proposed approach
was implemented within a relational object database model, but it
is generic enough to be incorporated into other database models.
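One way to picture an attribute-level imperfect data type is a possibility distribution over candidate values. The class and method names below are assumptions for illustration, not the type system proposed in the paper:

```python
class PossibilisticValue:
    """Attribute value known only as a possibility distribution."""
    def __init__(self, distribution):
        self.dist = dict(distribution)   # candidate value -> possibility in [0, 1]

    def possibility(self, value):
        """Possibility degree that the stored value equals `value`."""
        return self.dist.get(value, 0.0)

    def matches(self, predicate, threshold=0.5):
        """True if some candidate satisfying the predicate has possibility
        at least `threshold` (a simple fuzzy selection)."""
        best = max((p for v, p in self.dist.items() if predicate(v)), default=0.0)
        return best >= threshold

# Age known only imprecisely: "about 30"
age = PossibilisticValue({29: 0.7, 30: 1.0, 31: 0.7})
print(age.possibility(30))             # 1.0
print(age.matches(lambda v: v >= 31))  # True (degree 0.7 >= 0.5)
```

Other imperfect kinds mentioned in the literature (intervals, null variants, fuzzy sets over continuous domains) would be sibling types with the same query interface.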
Composing Software from Multiple Concerns: A Model and Composition Anomalies
Constructing software from components is considered to be a key requirement for managing the complexity of software. Separation of concerns only makes sense if the realizations of these concerns can be composed together effectively into a working program. Various publications have shown that composability of software is far from trivial and fails when components express complex behavior such as constraints, synchronization and history-sensitivity. We believe that to address the composability problems, we need to understand and define the situations where composition fails. To this end, in this paper we (a) introduce a general model of multi-dimensional concern composition, and (b) define so-called composition anomalies.
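A composition anomaly of the kind the paper formalizes can be illustrated with a toy example of our own (not taken from the paper): two concerns that each work in isolation but, composed in the wrong order, silently break a constraint.

```python
class Counter:
    def __init__(self):
        self.value = 0
    def inc(self, n=1):
        self.value += n

def bounded(obj, limit):
    """Constraint concern: reject increments that would exceed the limit."""
    inner = obj.inc
    def inc(n=1):
        if obj.value + n <= limit:
            inner(n)
    obj.inc = inc          # instance attribute shadows the method
    return obj

def batched(obj, factor):
    """Batching concern: amplify every increment by a factor."""
    inner = obj.inc
    def inc(n=1):
        inner(n * factor)
    obj.inc = inc
    return obj

# Anomalous order: the bound checks the raw increment (2 <= 5 passes),
# then batching amplifies it to 6, breaking the invariant.
c1 = bounded(batched(Counter(), factor=3), limit=5)
c1.inc(2)   # c1.value becomes 6 > limit

# Safe order: batching amplifies first, so the bound sees 6 and rejects it.
c2 = batched(bounded(Counter(), limit=5), factor=3)
c2.inc(2)   # c2.value stays 0
```

Each concern is locally correct; the anomaly arises only from the composition, which is precisely why such situations need to be identified and defined rather than debugged case by case.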