274 research outputs found

    Reasoning about Fuzzy Temporal and Spatial Information from the Web


    Cognition-based approaches for high-precision text mining

    This research improves the precision of information extraction from free-form text via cognition-based approaches to natural language processing (NLP). Cognitive approaches are an important, and relatively new, area of research in NLP, search, and linguistics, and they enable significant improvements in both the breadth and depth of knowledge extracted from text. This research has made contributions in the area of a cognitive approach to automated concept recognition. Cognitive approaches to search, also called concept-based search, have been shown to improve search precision. Given the tremendous amount of electronic text generated in our digital and connected world, cognitive approaches enable substantial opportunities in knowledge discovery. Because the generation and storage of electronic text is ubiquitous, opportunities for improved knowledge discovery span virtually all knowledge domains. While cognition-based search offers superior approaches, challenges exist due to the need to mimic, even in the most rudimentary way, the extraordinary powers of human cognition. This research addresses these challenges in the key area of a cognition-based approach to automated concept recognition. In addition, it produced a semantic processing system framework for use in applications in any knowledge domain. Confabulation theory, a relatively new theory of cognition that uses a non-Bayesian measure, called cogency, for predicting the results of human cognition, was applied to the problem of automated concept recognition. An innovative distance measure derived from cogent confabulation, called inverse cogency, was developed to rank-order candidate concepts during the recognition process. When used with a multilayer perceptron, it improved the precision of concept recognition by 5% over published benchmarks. Additional precision improvements are anticipated. 
These research steps build a foundation for cognition-based, high-precision text mining. Long-term, it is anticipated that this foundation will enable a cognition-based approach to automated ontology learning. Such automated ontology learning will mimic human language cognition and will, in turn, enable the practical use of cognition-based approaches in virtually any knowledge domain --Abstract, page iii
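The abstract names inverse cogency only as a distance measure derived from cogent confabulation; its exact definition is in the dissertation. As a hedged sketch, one plausible reading treats it as a negative-log product of conditional probabilities p(feature | candidate), so that smaller distances identify more cogent candidate concepts. The probabilities and concept names below are invented for illustration:

```python
import math

def inverse_cogency(candidate, context, cond_prob):
    """Hypothetical distance derived from a cogency-style product of
    conditional probabilities p(feature | candidate): the negative log
    turns the product into a sum, and smaller totals rank higher."""
    total = 0.0
    for feature in context:
        p = cond_prob.get((feature, candidate), 1e-9)  # floor for unseen pairs
        total += -math.log(p)
    return total

def rank_candidates(candidates, context, cond_prob):
    # Rank-order candidate concepts by ascending inverse cogency.
    return sorted(candidates, key=lambda c: inverse_cogency(c, context, cond_prob))

# Invented toy probabilities: which concept best explains the context words?
cond_prob = {("bank", "finance"): 0.8, ("loan", "finance"): 0.7,
             ("bank", "river"): 0.3, ("loan", "river"): 1e-4}
print(rank_candidates(["river", "finance"], ["bank", "loan"], cond_prob))
# → ['finance', 'river']
```

In this toy run, "finance" explains both context words well, so its summed distance is small; "river" is penalized heavily by the near-zero probability of "loan".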

    Personalizable Knowledge Integration

    Large repositories of data are used daily as knowledge bases (KBs) feeding computer systems that support decision making processes, such as in medical or financial applications. Unfortunately, the larger a KB is, the harder it is to ensure its consistency and completeness. The problem of handling KBs of this kind has been studied in the AI and databases communities, but most approaches focus on computing answers locally to the KB, assuming there is some single, epistemically correct solution. It is important to recognize that for some applications, as part of the decision making process, users consider far more knowledge than that which is contained in the knowledge base, and that sometimes inconsistent data may help in directing reasoning; for instance, inconsistency in taxpayer records can serve as evidence of a possible fraud. Thus, the handling of this type of data needs to be context-sensitive, creating a synergy with the user in order to build useful, flexible data management systems. Inconsistent and incomplete information is ubiquitous and presents a substantial problem when trying to reason about the data: how can we derive an adequate model of the world, from the point of view of a given user, from a KB that may be inconsistent or incomplete? In this thesis we argue that in many cases users need to bring their application-specific knowledge to bear in order to inform the data management process. Therefore, we provide different approaches to handle, in a personalized fashion, some of the most common issues that arise in knowledge management. Specifically, we focus on (1) inconsistency management in relational databases, general knowledge bases, and a special kind of knowledge base designed for news reports; (2) management of incomplete information in the form of different types of null values; and (3) answering queries in the presence of uncertain schema matchings. 
We allow users to define policies to manage both inconsistent and incomplete information in their applications in a way that takes into account both the user's knowledge of the problem and their attitude to error/risk. Using the frameworks and tools proposed here, users can specify when and how they want to manage or resolve the issues that arise from inconsistency and incompleteness in their data, in the way that best suits their needs.

    The design and implementation of fuzzy query processing on sensor networks

    Sensor nodes and Wireless Sensor Networks (WSN) enable observation of the physical world at unprecedented levels of granularity. A growing number of environmental monitoring applications are being designed to leverage the data collection features of WSN, increasing the need for efficient data management techniques and for comparative analysis of such techniques. My research leverages aspects of fuzzy databases, specifically fuzzy data representation and fuzzy (flexible) queries, to improve upon the efficiency of existing data management techniques by exploiting the inherent uncertainty of the data collected by WSN. Herein I present my research contributions. I provide a classification of WSN middleware to illustrate varying approaches to data management for WSN. This classification identifies a need to better handle the uncertainty inherent in data collected from physical environments, and to take advantage of the imprecision of the data to increase the efficiency of WSN by requiring that less information be transmitted to adequately answer the queries posed by WSN monitoring applications. In this dissertation, I present a novel approach to querying WSN, in which semantic knowledge about sensor attributes is represented as fuzzy terms. I present an enhanced simulation environment that supports more flexible and realistic analysis by using cellular automata models to separately model the deployed WSN and the underlying physical environment. Simulation experiments are used to evaluate my fuzzy query approach for environmental monitoring applications. My analysis shows that using fuzzy queries improves upon other data management techniques by reducing the amount of data that needs to be collected to accurately satisfy application requests. This reduction in data transmission results in increased battery life within sensors, an important measure of cost and performance for WSN applications.
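A minimal sketch of the fuzzy-term idea described above: a sensor attribute such as temperature is mapped to a linguistic term through a membership function, and a node answers a fuzzy query only when its reading's membership in the queried term is high enough, so clearly non-matching readings are never transmitted. The trapezoidal shape, the breakpoints for "hot", and the 0.5 threshold are illustrative assumptions, not taken from the dissertation:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: rises on [a,b], flat on [b,c], falls on [c,d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical fuzzy term for a temperature attribute (degrees Celsius).
def hot(temp):
    return trapezoid(temp, 25, 30, 40, 45)

# A node answers the fuzzy query "temperature IS hot" only when membership
# exceeds a threshold, so readings that clearly don't match are never sent.
readings = {"n1": 22.0, "n2": 31.5, "n3": 27.5}
matches = {n: hot(t) for n, t in readings.items() if hot(t) >= 0.5}
print(matches)  # → {'n2': 1.0, 'n3': 0.5}
```

Node n1's reading has zero membership in "hot", so under this scheme it would transmit nothing, which is the source of the energy savings the abstract reports.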

    CBR and MBR techniques: review for an application in the emergencies domain

    The purpose of this document is to provide an in-depth analysis of current reasoning engine practice and of the integration strategies of Case-Based Reasoning (CBR) and Model-Based Reasoning (MBR) that will be used in the design and development of the RIMSAT system. RIMSAT (Remote Intelligent Management Support and Training) is a European Commission funded project designed to: (a) provide an innovative, 'intelligent', knowledge-based solution aimed at improving the quality of critical decisions, and (b) enhance the competencies and responsiveness of individuals and organisations involved in highly complex, safety-critical incidents, irrespective of their location. In other words, RIMSAT aims to design and implement a decision support system that applies Case-Based Reasoning as well as Model-Based Reasoning technology to the management of emergency situations. This document is part of a deliverable for the RIMSAT project, and although it has been written in close contact with the requirements of the project, it provides an overview wide enough to constitute a state of the art in integration strategies between CBR and MBR technologies. Postprint (published version)

    Improving Model Finding for Integrated Quantitative-qualitative Spatial Reasoning With First-order Logic Ontologies

    Many spatial standards have been developed to harmonize the semantics and specifications of GIS data and to support sophisticated reasoning. All these standards include some types of simple and complex geometric features, and some of them incorporate simple mereotopological relations. But the relations as used in these standards only allow the extraction of qualitative information from geometric data, and they lack formal semantics linking geometric representations with mereotopological or other qualitative relations. This impedes integrated reasoning over qualitative data obtained from geometric sources and “native” topological information – for example as provided by textual sources where precise locations or spatial extents are unknown or unknowable. To address this issue, the first contribution in this dissertation is a first-order logical ontology that treats geometric features (e.g. polylines, polygons) and relations between them as specializations of more general types of features (e.g. any kind of 2D or 1D features) and of mereotopological relations between them. Key to this endeavor is the use of a multidimensional theory of space wherein, unlike in traditional logical theories of mereotopology (such as RCC), spatial entities of different dimensions can co-exist and be related. However, terminating or tractable reasoning with such an expressive ontology and potentially large amounts of data is a challenging AI problem. Model finding tools used to verify FOL ontologies with data usually employ a SAT solver to determine the satisfiability of the propositional instantiations (SAT problems) of the ontology. These solvers often experience scalability issues as the number of objects and the size and complexity of the ontology increase, limiting their use to ontologies with small signatures and to building small models with fewer than 20 objects. 
To investigate how an ontology influences the size of its SAT translation and, consequently, the model finder's performance, we develop a formalization of FOL ontologies with data. We theoretically identify parameters of an ontology that significantly contribute to the dramatic growth in size of the SAT problem. The search space of the SAT problem is exponential in the signature of the ontology (the number of predicates in the axiomatization plus any additional predicates from skolemization) and in the number of distinct objects in the model. Axiomatizations that contain many definitions lead to a large number of SAT propositional clauses, a consequence of the conversion of biconditionals to clausal form. We therefore postulate that optional definitions are ideal sentences to eliminate from an ontology to boost the model finder's performance. We then formalize optional definition elimination (ODE) as an FOL ontology preprocessing step and test the simplification on a set of spatial benchmark problems, generating smaller SAT problems (with fewer clauses and variables) without changing the satisfiability or semantic meaning of the problem. We experimentally demonstrate that the reduction in SAT problem size also leads to improved model finding with state-of-the-art model finders, with speedups of 10-99%. Altogether, this dissertation improves spatial reasoning capabilities using FOL ontologies: it provides a formal framework for integrated qualitative-geometric reasoning, and specific ontology preprocessing steps that can be built into automated reasoners to achieve better speedups in model finding times and scalability with moderately sized datasets.
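The effect of a definitional biconditional on the SAT translation can be illustrated at propositional scale. A definition P <-> (Q AND R) contributes three clauses in clausal form; if P occurs nowhere else in the problem, the definition is optional, and eliminating it shrinks the SAT problem without changing satisfiability. The brute-force checker below is only an illustrative stand-in for a real SAT solver:

```python
from itertools import product

def satisfiable(clauses, varnames):
    """Brute-force SAT check over all assignments (fine for tiny sketches).
    Literals are (variable, negated?) pairs; a literal holds when the
    assigned truth value differs from its negation flag."""
    for values in product([False, True], repeat=len(varnames)):
        assign = dict(zip(varnames, values))
        if all(any(assign[v] != neg for v, neg in clause) for clause in clauses):
            return True
    return False

# Definition P <-> (Q and R) in clausal form: (~P|Q), (~P|R), (P|~Q|~R).
with_def = [[("P", True), ("Q", False)],
            [("P", True), ("R", False)],
            [("P", False), ("Q", True), ("R", True)],
            [("Q", False)]]          # rest of the problem: Q must hold
without_def = [[("Q", False)]]       # optional definition eliminated

# Both versions are satisfiable; the second has one clause and one variable.
print(satisfiable(with_def, ["P", "Q", "R"]), satisfiable(without_def, ["Q"]))
# → True True
```

The same satisfiability with fewer clauses and variables is exactly the payoff the abstract attributes to ODE, only at the scale of full first-order axiomatizations grounded over many objects.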

    A process-oriented data model for fuzzy spatial objects

    The complexity of the natural environment, its polythetic and dynamic character, requires appropriate new methods to represent it in GISs, if only because in the past there has been a tendency to force reality into sharp and static objects. A more generalized spatio-temporal data model is required to deal with the fuzziness and dynamics of objects. This need is the motivation behind the research reported in this thesis. In particular, the objective of this research was to develop a spatio-temporal data model for objects with fuzzy spatial extent. This thesis discusses three aspects related to achieving this objective: identification of fuzzy objects, detection of dynamic changes in fuzzy objects, and representation of objects and their dynamics in a spatio-temporal data model. For the identification of fuzzy objects, a six-step procedure was proposed to extract objects from field observation data: sampling, interpolation, classification, segmentation, merging and identification. The uncertainties involved in these six steps were investigated and their effect on the mapped objects was analyzed. Three fuzzy object models were proposed to represent fuzzy objects of different application contexts. The concepts of conditional spatial extent, conditional boundary and transition zones of fuzzy objects were put forward and formalized based upon the formal data structure (FDS). In this procedure, uncertainty was transferred from thematic aspects to geometric aspects of objects, i.e. existential uncertainty was converted to extensional uncertainty. The spatial effect of uncertainty in the thematic aspect was expressed by the relationship between the uncertainty of a cell belonging to the spatial extent of an object and the uncertainty of the cell belonging to classes. To detect dynamic changes in fuzzy objects, a method was proposed to identify objects and their state transitions from fuzzy spatial extents (regions) at different epochs. 
Similarity indicators of fuzzy regions were calculated based upon the overlap between regions at consecutive epochs. Different combinations of indicator values imply different relationships between regions. Regions that were very similar represent the consecutive states of one object. By linking the regions, the historic lifelines of objects are built automatically. The relationships between regions then become the relationships or interactions between objects, which were expressed in terms of processes such as shift, merge or split. By comparing the spatial extents of objects at consecutive epochs, the change of objects was detected. The uncertainty of the change was analyzed by a series of change maps at different certainty levels. These can provide decision makers with more accurate information about change. For the third and last aspect, a process-oriented spatio-temporal data model was proposed to represent the change and interaction of objects. The model was conceptually designed based upon the formalized representation of states and processes of objects and was represented by a star-styled extended entity relationship, which I have called the Star Model. The conceptual design of the Star Model was translated into a relational logical design, since many commercial relational database management systems are available. A prototype of the process-oriented spatio-temporal data model was implemented in ArcView based upon the case of Ameland. The user interface and queries of the prototype were developed using Avenue, the programming language of ArcView. The procedure for identification of fuzzy objects, which extracts fuzzy object data from field observations, unifies the existing field-oriented and object-oriented approaches. Therefore a generalized object concept - object with fuzzy spatial extent - has been developed. This concept links the object-oriented and the field-oriented characteristics of natural phenomena. 
The objects have conditional boundaries, representing their object characteristics; the interiors of the objects have field properties, representing their gradual and continuous distribution. Furthermore, the concept can handle both fuzzy and crisp objects. In the fuzzy case, objects have fuzzy transition or boundary zones, in which conditional boundaries may be defined; crisp objects can be considered a special case, i.e. objects with sharp boundaries. Beyond that, both the boundary-oriented approach and the pixel-oriented approach to object extraction can use this generalized object concept, since the uncertainties of objects are expressed in the formal data structures (FDSs), which are applicable to either approach. The proposed process-oriented spatio-temporal data model is a general one, from which other models can be derived. It can support analysis and queries of time series data from varying perspectives through location-oriented, time-oriented, feature-oriented and process-oriented queries, in order to understand the behavior of dynamic spatial complexes of natural phenomena. Multiple strands of time can also be generated in this Star Model, each representing the (spatio-temporal) lifeline of an object. The model can represent dynamic processes affecting the spatial and thematic aspects of individual objects and object complexes. Because the model explicitly stores change (process) relative to time, procedures for answering queries relating to temporal relationships, as well as analytical tasks for comparing different sequences of change, are facilitated. The research findings in this thesis contribute theoretically and practically to the development of spatio-temporal data models for objects with fuzzy spatial extent.
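The overlap-based linking of regions into object lifelines described above can be sketched with regions as sets of raster cells. The two indicators used here (shared cells as a fraction of each region) and the 0.5 linking threshold are illustrative assumptions, not the thesis's exact indicator set:

```python
def overlap_indicators(region_t, region_t1):
    """Hypothetical similarity indicators between two regions, each given
    as a set of raster cells: the fraction of each region lying in the overlap."""
    inter = region_t & region_t1
    return (len(inter) / len(region_t), len(inter) / len(region_t1))

def link(regions_t, regions_t1, threshold=0.5):
    # Link a region at epoch t to a region at epoch t+1 when both indicators
    # reach the threshold: the two regions are then consecutive states of
    # one object, extending its historic lifeline.
    links = []
    for i, r in enumerate(regions_t):
        for j, s in enumerate(regions_t1):
            a, b = overlap_indicators(r, s)
            if a >= threshold and b >= threshold:
                links.append((i, j))
    return links

t0 = [{(0, 0), (0, 1), (1, 0), (1, 1)}]
t1 = [{(0, 1), (1, 1), (0, 2), (1, 2)}]   # same region shifted one cell
print(link(t0, t1))  # → [(0, 0)]: one object, a "shift" process
```

In this toy case both indicators equal 0.5, so the two regions are linked as states of one object; a region at t+1 overlapping two regions at t would instead suggest a merge.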

    North American Fuzzy Logic Processing Society (NAFIPS 1992), volume 2

    This document contains papers presented at NAFIPS '92, the North American Fuzzy Information Processing Society conference. More than 75 papers were presented at this conference, which was sponsored by NAFIPS in cooperation with NASA, the Instituto Tecnologico de Morelia, the Indian Society for Fuzzy Mathematics and Information Processing (ISFUMIP), the Instituto Tecnologico de Estudios Superiores de Monterrey (ITESM), the International Fuzzy Systems Association (IFSA), the Japan Society for Fuzzy Theory and Systems, and the Microelectronics and Computer Technology Corporation (MCC). Fuzzy set theory has led to a large number of diverse applications. Recently, interesting applications have been developed which involve the integration of fuzzy systems with adaptive processes such as neural networks and genetic algorithms. NAFIPS '92 was directed toward the advancement, commercialization, and engineering development of these technologies.

    New Trends in Neutrosophic Theory and Applications Volume II

    The neutrosophic set has been derived from a new branch of philosophy, namely Neutrosophy. The neutrosophic set is capable of dealing with uncertainty, indeterminacy and inconsistent information, and neutrosophic approaches are suitable for modeling problems with such information in which human knowledge and human evaluation are needed. Neutrosophic set theory was proposed in 1998 by Florentin Smarandache, who also developed the concept of the single valued neutrosophic set, oriented towards real-world scientific and engineering applications. Since then, single valued neutrosophic set theory has been extensively studied in books and monographs introducing neutrosophic sets and their applications, by many authors around the world. Also, an international journal, Neutrosophic Sets and Systems, started its journey in 2013. Single valued neutrosophic sets have found their way into several hybrid systems, such as the neutrosophic soft set, rough neutrosophic set, neutrosophic bipolar set, neutrosophic expert set, rough bipolar neutrosophic set, neutrosophic hesitant fuzzy set, etc. Successful applications of single valued neutrosophic sets have been developed in multiple criteria and multiple attribute decision making. This second volume collects original research and application papers from different perspectives, covering areas of neutrosophic studies such as decision making, graph theory, image processing, probability theory and topology, as well as some theoretical papers. This volume contains four sections: DECISION MAKING, NEUTROSOPHIC GRAPH THEORY, IMAGE PROCESSING, and ALGEBRA AND OTHER PAPERS. The first paper (Pu Ji, Peng-fei Cheng, Hongyu Zhang, Jianqiang Wang: Interval valued neutrosophic Bonferroni mean operators and the application in the selection of renewable energy) aims to construct selection approaches for renewable energy that consider the interrelationships among criteria. To do that, the Bonferroni mean (BM) and the geometric BM (GBM) are employed.
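The paper works with interval-valued neutrosophic numbers; the aggregation underneath is the classical Bonferroni mean BM^{p,q}(a_1,...,a_n) = ((1/(n(n-1))) * sum_{i != j} a_i^p a_j^q)^{1/(p+q)}, which captures pairwise interrelationships among criteria because every term couples two different criterion values. A scalar sketch with invented criterion scores:

```python
def bonferroni_mean(values, p=1, q=1):
    """Classical Bonferroni mean BM^{p,q}: aggregates by averaging the
    products a_i^p * a_j^q over all ordered pairs with i != j, so each
    criterion's contribution is weighted by its interplay with the others."""
    n = len(values)
    s = sum(values[i] ** p * values[j] ** q
            for i in range(n) for j in range(n) if i != j)
    return (s / (n * (n - 1))) ** (1 / (p + q))

# Illustrative criterion scores for one hypothetical energy alternative.
print(round(bonferroni_mean([0.6, 0.8, 0.7]), 4))  # → 0.6976
```

Like any mean, BM returns the common value when all inputs are equal; the interval-valued neutrosophic operators of the paper apply this same pairwise structure to truth, indeterminacy and falsity intervals.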