Towards Modeling Conceptual Dependency Primitives with Image Schema Logic
Conceptual Dependency (CD) primitives and Image Schemas (IS) share a common goal of grounding symbols of natural language in a representation that allows for automated semantic interpretation. Both seek to establish a connection between high-level conceptualizations in natural language and abstract cognitive building blocks. Some previous approaches have established a CD-IS correspondence. In this paper, we build on this correspondence in order to apply a logic designed for image schemas to selected CD primitives, with the goal of formally accounting for the CD inventory. The logic draws from the Region Connection Calculus (RCC-8), the Qualitative Trajectory Calculus (QTC), Cardinal Directions, and Linear Temporal Logic (LTL). One of the primary premises of CD is a minimalist approach to its inventory of primitives; that is, it seeks to express natural language content in an abstract manner with as few primitives as possible. In a formal analysis of the physical primitives of CD, we found a potential reduction of the inventory, since some primitives can be expressed as special cases of others.
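To illustrate the kind of formalization such a logic affords (a hedged sketch, not the paper's actual axioms), a CD transfer primitive such as PTRANS could be rendered by combining RCC-8 relations with an LTL temporal operator: an object $o$ moves from region $x$ to region $y$ when it is initially part of $x$ and eventually detaches from $x$ and becomes part of $y$.

```latex
\mathrm{PTRANS}(o, x, y) \;\equiv\;
  \mathrm{TPP}(o, x) \;\wedge\;
  \Diamond\,\bigl(\mathrm{DC}(o, x) \wedge \mathrm{TPP}(o, y)\bigr)
```

Here $\mathrm{TPP}$ is RCC-8's "tangential proper part", $\mathrm{DC}$ is "disconnected", and $\Diamond$ is LTL's "eventually"; the exact relations and operators used for each primitive are a choice the paper itself works out.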
Ontology for pixel processing
For all kinds of output devices, such as monitors and printers, the most important task is to show the right information to the user. The pixel is the basic element both of screens and of printed material, and pixel processing is therefore the basic technique for making output correct, precise, and suitable for different occasions. Pixel processing applies operations to each pixel of an image, i.e. to the pixel matrices of that image, so that the image takes on a different appearance. Ontology is concerned with the exact description of things and their relationships; it is an old branch of philosophy dating back to ancient Greece. As the study of artificial intelligence has grown, the concept of ontology has been used more and more in the formalization of knowledge in terms of classes, properties, instances and relations [1]. This paper mainly discusses how to build an ontology of pixel processing with OWL. Specifically, it focuses on how to describe pixel processing and its functions or operations in a way that is understandable by a computer. With such a description, it becomes possible to improve the development of pixel processing and the sharing of its knowledge both between people and between machines, from the Natural Language Processing point of view. In the future, it also provides a basis for an intelligent agent to implement pixel processing by understanding such definitions and descriptions directly through a knowledge base built from this ontology; in other words, it may enable automatic programming or program analysis.
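As a minimal illustration of what "operations on each pixel of the pixel matrix" means in practice (a sketch with invented function names, not drawn from the paper's ontology), a per-pixel operation is simply a function applied independently to every entry of the image matrix:

```python
def map_pixels(image, op):
    """Apply a per-pixel operation to a grayscale image given as a list of rows."""
    return [[op(p) for p in row] for row in image]

def threshold(level):
    """Binarize: pixels at or above `level` become 255, all others become 0."""
    return lambda p: 255 if p >= level else 0

img = [[12, 200],
       [130, 90]]
print(map_pixels(img, threshold(128)))  # [[0, 255], [255, 0]]
```

An ontology of pixel processing would then describe operations like `threshold` in terms of their class (point operation), inputs (a pixel value and a level), and effect, so that an agent could select and compose them from their descriptions.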
Knowledge based approach to flexible workflow management systems
This thesis was submitted for the degree of Doctor of Philosophy and awarded by the Korea Advanced Institute of Science and Technology (KAIST).

Today's business environments are dynamic and uncertain. To support business processes effectively in such contexts, workflow management systems must be able to adapt themselves. In this dissertation, the workflow concept is redefined and represented with a set of business rules. Business rules play a central role in organizational workflows in the context of cooperation among actors. To achieve business goals, they constrain the flow of work, the use of resources, and the responsibility mapping between tasks and actors using the role concept. Business rules are explicitly modeled in the Knowledge-based Workflow Model (KWM) using frames.

To increase the adaptability of a workflow management system, KWM has several distinctive features. First, it increases the expressiveness of the workflow model so that exception handling rules and responsibility mapping rules between tasks and actors, as well as task scheduling rules, are explicitly modeled. Secondly, the formal definition of KWM enables one to define and analyze the correctness of a workflow schema. The knowledge-based approach enables more powerful analysis of a workflow schema, including checking the consistency and compactness of routing rules as well as the terminality of a workflow. Thirdly, a change propagation mechanism, which assures the correctness of a workflow after modification of its schema, increases adaptability. Change propagation rules for the modification primitives are provided to manage workflow evolution. In addition, metarules that control the rules in KWM are used to handle exceptions that occur in a running workflow instance. Workflow participants can easily change the workflow schema of a workflow instance with the support of extra rules and a metarule.

Based on KWM, K-WFMS (Knowledge-based WorkFlow Management System) has been implemented in a client/server architecture. The inference shell of a knowledge-based system is employed for the enactment of business rules and integrated with database systems. A real application based on the KWM architecture has shown that system performance can increase notably by reducing the number of rules and facts used in the course of workflow enactment.
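A toy sketch of rule-based workflow enactment in the spirit of the description above (hypothetical names and structure; KWM itself models rules as frames, which are not reproduced here). Each business rule pairs a condition on the workflow instance's state with an action, and enactment fires the applicable rules:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A business rule: fire `action` on the instance state when `condition` holds."""
    name: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]

def enact(rules: list[Rule], state: dict) -> list[str]:
    """One forward-chaining pass: fire every applicable rule once, in order."""
    fired = []
    for rule in rules:
        if rule.condition(state):
            rule.action(state)
            fired.append(rule.name)
    return fired

# Task-scheduling rule: 'review' may be scheduled only after 'draft' is done.
schedule_review = Rule(
    "schedule_review",
    condition=lambda s: s.get("draft") == "done" and "review" not in s,
    action=lambda s: s.update(review="scheduled"),
)
# Exception-handling rule: escalate when the assigned actor is unavailable.
escalate = Rule(
    "escalate",
    condition=lambda s: s.get("actor_available") is False,
    action=lambda s: s.update(escalated=True),
)

state = {"draft": "done", "actor_available": False}
print(enact([schedule_review, escalate], state))  # ['schedule_review', 'escalate']
```

In the thesis's terms, metarules would sit one level above this: rules that decide which of these rules apply to a running instance, e.g. suspending the normal scheduling rules while an exception is being handled.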
Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design
The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across particular software design domains include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.
A Process Modelling Framework Based on Point Interval Temporal Logic with an Application to Modelling Patient Flows
This thesis considers an application of a temporal theory to describe and model the patient journey in the hospital accident and emergency (A&E) department. The aim is to introduce a generic but dynamic method applicable to any setting, including healthcare. Constructing a consistent process model can be instrumental in streamlining healthcare issues. Current process modelling techniques used in healthcare, such as flowcharts, the unified modelling language activity diagram (UML AD), and business process modelling notation (BPMN), are intuitive but imprecise: they cannot fully capture the complexity of the types of activities and the full extent of temporal constraints to an extent where one could reason about the flows. Formal approaches such as Petri nets have also been reviewed to investigate their applicability to modelling processes in the healthcare domain.
Additionally, current modelling standards offer no formal mechanism for scheduling patient flows, so healthcare relies on the critical path method (CPM) and the program evaluation and review technique (PERT), which also have limitations, e.g. the finish-start barrier. It is imperative to be able to specify temporal constraints between the start and/or end of processes, e.g. that the beginning of a process A precedes the start (or end) of a process B; however, these approaches fail to provide a mechanism for handling such temporal situations. A formal representation, if provided, can assist in effective knowledge representation and quality enhancement concerning a process. It would also help in uncovering the complexities of a system and assist in modelling it in a consistent way, which is not possible with the existing modelling techniques.
The above issues are addressed in this thesis by proposing a framework that provides a knowledge base for accurately modelling patient flows, based on point interval temporal logic (PITL), which treats points and intervals as primitives. These objects constitute the knowledge base for the formal description of a system. With the aid of the inference mechanism of the temporal theory presented here, the exhaustive temporal constraints derived from the proposed axiomatic system's components serve as a knowledge base.
The proposed methodological framework adopts a model-theoretic approach in which a theory is developed and considered as a model, while the corresponding instance is considered as its application. This approach assists in identifying the core components of the system and their precise operation, representing a real-life domain deemed suitable to the process modelling issues specified in this thesis. Thus, I have evaluated the modelling standards for their most-used terminologies and constructs to identify their key components. This also assists in the generalisation of the critical terms (of the process modelling standards) based on their ontology. A set of generalised terms is proposed that serves as an enumeration of the theory and subsumes the core modelling elements of the process modelling standards. The catalogue presents a knowledge base for the business and healthcare domains, and its components are formally defined (semantics). Furthermore, a resolution theorem proof is used to show the structural features of the theory (model) and to establish that it is sound and complete.
After establishing that the theory is sound and complete, the next step is to provide an instantiation of the theory. This is achieved by mapping the core components of the theory to their corresponding instances. Additionally, a formal graphical tool termed the point graph (PG) is used to visualise the cases of the proposed axiomatic system. The PG facilitates modelling and scheduling of patient flows and enables the analysis of existing models for possible inaccuracies and inconsistencies, supported by a reasoning mechanism based on PITL. Following that, a transformation is developed to map the core modelling components of the standards into the extended PG (PG*), based on the semantics presented by the axiomatic system.
A real-life case (from the King's College Hospital accident and emergency (A&E) department's trauma patient pathway) is considered to validate the framework. It is divided into three patient flows to depict the journey of a patient with significant trauma: arriving at A&E, undergoing a procedure, and subsequently being discharged. The staff relied upon UML AD and BPMN to model the patient flows. An evaluation of their representation is presented to show the shortfalls of the modelling standards in modelling patient flows. The last step is to model these patient flows using the developed approach, which is supported by enhanced reasoning and scheduling.
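A minimal sketch of the kind of point-based temporal constraint described in this abstract (hypothetical names; not the thesis's PITL axiomatisation): treating each process as an interval bounded by start and end points, both the precedence constraint "the beginning of A precedes the start of B" and the CPM-style finish-start barrier reduce to comparisons on those points.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A process occurrence bounded by start and end time points (start < end)."""
    name: str
    start: float
    end: float

def start_precedes_start(a: Interval, b: Interval) -> bool:
    """The beginning of process a precedes the beginning of process b."""
    return a.start < b.start

def finish_start(a: Interval, b: Interval) -> bool:
    """CPM-style finish-start barrier: b may not begin until a has ended."""
    return a.end <= b.start

triage = Interval("triage", 0, 10)
xray = Interval("x-ray", 5, 20)                # overlaps triage
print(start_precedes_start(triage, xray))      # True
print(finish_start(triage, xray))              # False: x-ray starts before triage ends
```

The limitation the thesis points at is visible even here: CPM/PERT can only express the `finish_start` kind of relation, whereas a point-based logic can state any relation between the four endpoints of two intervals.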
A schema-based P2P network to enable publish-subscribe for multimedia content in open hypermedia systems
Open Hypermedia Systems (OHS) aim to provide efficient dissemination, adaptation and integration of hyperlinked multimedia resources. Content available in Peer-to-Peer (P2P) networks could add significant value to OHS, provided that challenges for efficient discovery and prompt delivery of rich and up-to-date content are successfully addressed. This paper proposes an architecture that enables the operation of OHS over a P2P overlay network of OHS servers, based on semantic annotation of (a) peer OHS servers and of (b) multimedia resources that can be obtained through the link services of the OHS. The architecture provides efficient resource discovery. Semantic query-based subscriptions over this P2P network can enable access to up-to-date content, while caching at certain peers enables prompt delivery of multimedia content. Advanced query resolution techniques are employed to match different parts of subscription queries (subqueries). These subscriptions can be shared among different interested peers, thus increasing the efficiency of multimedia content dissemination.
Mapping-equivalence and oid-equivalence of single-function object-creating conjunctive queries
Conjunctive database queries have been extended with a mechanism for object creation to capture important applications such as data exchange, data integration, and ontology-based data access. Object creation generates new object identifiers in the result that do not belong to the set of constants in the source database. The new object identifiers can also be seen as Skolem terms. Hence, object-creating conjunctive queries can also be regarded as restricted second-order tuple-generating dependencies (SO tgds), considered in the data exchange literature.

In this paper, we focus on the class of single-function object-creating conjunctive queries, or sifo CQs for short. We give a new characterization for oid-equivalence of sifo CQs that is simpler than the one given by Hull and Yoshikawa and places the problem in the complexity class NP. Our characterization is based on Cohen's equivalence notions for conjunctive queries with multiplicities. We also solve the logical entailment problem for sifo CQs, showing that this problem also belongs to NP. Results by Pichler et al. have shown that logical equivalence for more general classes of SO tgds is either undecidable or decidable with as yet unknown complexity upper bounds.

Comment: This revised version has been accepted on 11 January 2016 for publication in The VLDB Journal.
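To give an intuition for single-function object creation (a hedged sketch with an invented relation, not the paper's formal definition): a sifo CQ uses exactly one Skolem function to mint object identifiers, so equal arguments always produce the same fresh identifier and distinct arguments produce distinct ones, and none of the minted identifiers occur as constants in the source.

```python
def sifo_cq(emp_facts):
    """Evaluate a toy sifo CQ over a hypothetical Emp(name, dept) relation:
    for each fact, output (name, f(dept)), where f is the query's single
    Skolem function creating one fresh object identifier per department."""
    skolem = {}
    def f(dept):
        if dept not in skolem:
            skolem[dept] = f"oid_{len(skolem)}"  # fresh oid, not a source constant
        return skolem[dept]
    return [(name, f(dept)) for name, dept in emp_facts]

facts = [("ann", "sales"), ("bob", "sales"), ("eve", "hr")]
print(sifo_cq(facts))  # [('ann', 'oid_0'), ('bob', 'oid_0'), ('eve', 'oid_1')]
```

Oid-equivalence then asks whether two such queries produce the same results on every database up to a renaming of the minted identifiers, which is why it is subtler than ordinary conjunctive-query equivalence.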
Ontological foundations for structural conceptual models
In this thesis, we aim at contributing to the theory of conceptual modeling and ontology representation. Our main objective here is to provide ontological foundations for the most fundamental concepts in conceptual modeling. These foundations comprise a number of ontological theories, which are built on established work in philosophical ontology, cognitive psychology, philosophy of language, and linguistics. Together these theories amount to a system of categories and formal relations known as a foundational ontology.
Symbol Emergence in Robotics: A Survey
Humans can learn the use of language through physical interaction with their environment and semiotic communication with other people. It is very important to obtain a computational understanding of how humans can form a symbol system and obtain semiotic skills through their autonomous mental development. Recently, many studies have been conducted on the construction of robotic systems and machine-learning methods that can learn the use of language through embodied multimodal interaction with their environment and other systems. Understanding human social interactions, and developing a robot that can smoothly communicate with human users in the long term, requires an understanding of the dynamics of symbol systems and is therefore crucially important. The embodied cognition and social interaction of participants gradually change a symbol system in a constructive manner. In this paper, we introduce a field of research called symbol emergence in robotics (SER). SER is a constructive approach towards an emergent symbol system. The emergent symbol system is socially self-organized through both semiotic communications and physical interactions with autonomous cognitive developmental agents, i.e., humans and developmental robots. Specifically, we describe some state-of-the-art research topics concerning SER, e.g., multimodal categorization, word discovery, and double articulation analysis, which enable a robot to obtain words and their embodied meanings from raw sensory-motor information, including visual, haptic, and auditory information as well as acoustic speech signals, in a totally unsupervised manner. Finally, we suggest future directions of research in SER.

Comment: submitted to Advanced Robotics.