
    Simulation of complex environments: the Fuzzy Cognitive Agent

    The world is becoming increasingly competitive through the action of liberalised national and global markets. In parallel, these markets have become increasingly complex, making it difficult for participants to optimise their trading actions. In response, many different computer simulation techniques have been investigated, either to develop a deeper understanding of these evolving markets or to create effective system support tools. In this paper we report our efforts to develop a novel simulation platform using fuzzy cognitive agents (FCAs). Our approach encapsulates fuzzy cognitive maps (FCMs) generated on the Matlab Simulink platform within commercially available agent software. We first present our implementation of Matlab Simulink FCMs and then show how such FCMs can be integrated within a conceptual FCA architecture. Finally, we report on our efforts to realise an FCA by integrating a Matlab Simulink based FCM with the Jack Intelligent Agent Toolkit.
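
    As a minimal illustration of the FCM inference such an agent would encapsulate, the sketch below (in Python rather than Matlab Simulink, and with hypothetical market concepts) iterates the standard update rule A(t+1) = f(A(t) + A(t)·W) with a sigmoid squashing function; it shows the generic mechanism only, not the paper's implementation.

```python
# Minimal sketch of a fuzzy cognitive map (FCM) update loop.
# The paper builds its FCMs in Matlab Simulink and wraps them in JACK agents;
# this Python version only shows the generic inference rule, and the three
# "market" concepts below are hypothetical.
import numpy as np

def sigmoid(x, lam=1.0):
    """Squashing function keeping concept activations in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_step(activations, weights, lam=1.0):
    """One FCM iteration: A(t+1) = f(A(t) + A(t) @ W)."""
    return sigmoid(activations + activations @ weights, lam)

# Hypothetical concepts: demand, price, trading activity.
A = np.array([0.6, 0.4, 0.5])            # initial activations
W = np.array([[ 0.0,  0.7,  0.3],        # demand   -> price, activity
              [-0.4,  0.0,  0.5],        # price    -> demand, activity
              [ 0.2,  0.1,  0.0]])       # activity -> demand, price

for _ in range(20):                      # iterate towards a fixed point
    A = fcm_step(A, W)
print(A)
```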

    Designing Software Architectures As a Composition of Specializations of Knowledge Domains

    This paper summarizes our experimental research and software development activities in designing robust, adaptable and reusable software architectures. Several years ago, based on our previous experience in object-oriented software development, we made the following assumption: ‘A software architecture should be a composition of specializations of knowledge domains’. To verify this assumption we carried out three pilot projects. In addition to applying some popular domain analysis techniques such as use cases, we identified the invariant compositional structures of the software architectures and the related knowledge domains. Knowledge domains define the boundaries of the adaptability and reusability capabilities of software systems. Next, the knowledge domains were mapped to object-oriented concepts. We found that some aspects of knowledge could not be directly modeled in terms of object-oriented concepts. In this paper we describe our approach, the pilot projects, the problems we encountered and the solutions we adopted for realizing the software architectures. We conclude the paper with the lessons learned from this experience.
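
    The sketch below is one hedged reading of the assumption stated above: each knowledge domain is modelled as an abstract interface, its specializations as subclasses, and the architecture as a composition of chosen specializations. The domain names (scheduling, persistence) are hypothetical and are not taken from the pilot projects.

```python
# Hedged sketch of "a software architecture as a composition of
# specializations of knowledge domains". Domain and class names are
# hypothetical illustrations, not the paper's pilot projects.
from abc import ABC, abstractmethod

class SchedulingDomain(ABC):                    # a knowledge domain
    @abstractmethod
    def next_task(self, tasks): ...

class EarliestDeadlineFirst(SchedulingDomain):  # one specialization
    def next_task(self, tasks):
        return min(tasks, key=lambda t: t["deadline"])

class PersistenceDomain(ABC):                   # another knowledge domain
    @abstractmethod
    def save(self, record): ...

class InMemoryStore(PersistenceDomain):         # one specialization
    def __init__(self):
        self._records = []
    def save(self, record):
        self._records.append(record)

class Application:
    """The architecture: a composition of domain specializations."""
    def __init__(self, scheduler: SchedulingDomain, store: PersistenceDomain):
        self.scheduler, self.store = scheduler, store

app = Application(EarliestDeadlineFirst(), InMemoryStore())
app.store.save(app.scheduler.next_task([{"id": 1, "deadline": 5},
                                        {"id": 2, "deadline": 3}]))
```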

    A distributed agent architecture for real-time knowledge-based systems: Real-time expert systems project, phase 1

    We propose a distributed agent architecture (DAA) that can support a variety of paradigms based on both traditional real-time computing and artificial intelligence. DAA consists of distributed agents that fall into two categories: reactive and cognitive. Reactive agents can be implemented directly in Ada to meet hard real-time requirements and can be deployed on on-board embedded processors. A traditional real-time computing methodology under consideration is rate monotonic theory, which can guarantee schedulability using analytical methods. AI techniques under consideration for reactive agents include approximate or anytime reasoning, which can be implemented using Bayesian belief networks, as in Guardian. Cognitive agents are traditional expert systems that can be implemented in ART-Ada to meet soft real-time requirements. During the initial design of cognitive agents, it is critical to consider the migration path that would allow initial deployment on ground-based workstations with eventual deployment on on-board processors. ART-Ada technology enables this migration, whereas Lisp-based technologies make it difficult, if not impossible. In addition to reactive and cognitive agents, a meta-level agent is needed to coordinate multiple agents and to provide meta-level control.
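
    The analytical schedulability guarantee referred to above can be illustrated with the classic Liu and Layland utilization bound for rate monotonic scheduling; the sketch below, with made-up task parameters, applies this sufficient test and is not drawn from the DAA itself.

```python
# Illustrative check of the Liu & Layland rate-monotonic utilization bound,
# the kind of analytical schedulability test the abstract mentions for
# reactive agents. Task parameters are made-up examples.
def rm_schedulable(tasks):
    """tasks: list of (worst_case_execution_time, period) pairs.
    Returns (passes_sufficient_test, utilization, bound)."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)     # n(2^(1/n) - 1)
    return utilization <= bound, utilization, bound

ok, u, bound = rm_schedulable([(1, 4), (2, 8), (1, 10)])
print(f"U = {u:.3f}, bound = {bound:.3f}, schedulable (sufficient test): {ok}")
```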

    Fuzzy Dynamic Discrimination Algorithms for Distributed Knowledge Management Systems

    A reduction of the algorithmic complexity of the fuzzy inference engine exploits the following property: the inputs (the fuzzy rules and the fuzzy facts) can be divided into two parts, one that remains relatively constant over a long period (the fuzzy rules, i.e. the knowledge model) and one that changes at every inference cycle (the fuzzy facts). It therefore makes sense to apply certain transformations to the constant part in advance, in order to decrease the time needed to obtain a solution once the varying part becomes known. Transformations carried out in advance are called pre-processing, or knowledge compilation. The use of variables in a Business Rule Management System knowledge representation allows knowledge to be factorised, as in classical knowledge-based systems. The language of first-order predicates facilitates the formulation of complex knowledge in a rigorous way, but requires appropriate reasoning techniques. It is thus necessary to define the description method for fuzzy knowledge, to justify the efficiency of knowledge exploitation when the compiling technique is used, to present the inference engine, and to highlight the functional features of the pattern matching and state space processes. This paper presents the main results of our project PR356 on designing a compiler for fuzzy knowledge, similar to a Rete compiler, that comprises two main components: a static fuzzy discrimination structure (the Fuzzy Unification Tree) and the Fuzzy Variables Linking Network. We also present the features of the elementary pattern matching process, which is based on the compiled structure of fuzzy knowledge. We developed fuzzy discrimination algorithms for Distributed Knowledge Management Systems (DKMSs). The implementations have been elaborated in a prototype system, FRCOM (Fuzzy Rule COMpiler).
    Keywords: Fuzzy Unification Tree, Dynamic Discrimination of Fuzzy Sets, DKMS, FRCOM
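
    The sketch below illustrates only the separation principle behind this kind of knowledge compilation: the relatively constant rules are pre-processed once into an index, and each inference cycle matches only the varying fuzzy facts against it. It is a simplified stand-in, not the FRCOM Fuzzy Unification Tree or the Fuzzy Variables Linking Network, and the rule and variable names are hypothetical.

```python
# Hedged sketch of knowledge compilation for fuzzy rules: the constant part
# (the rules) is indexed once; each cycle only the varying fuzzy facts are
# matched against that index. Not the FRCOM structure, just the principle.

def compile_rules(rules):
    """Done once: group rules by the linguistic variables they test."""
    index = {}
    for rule in rules:
        for var in rule["if"]:
            index.setdefault(var, []).append(rule)
    return index

def match(index, facts):
    """Per cycle: find candidate rules via the index, then compute each
    candidate's firing degree over all its conditions (min t-norm)."""
    candidates = {id(r): r for var in facts for r in index.get(var, [])}
    degrees = {}
    for rule in candidates.values():
        mu = min(facts.get(var, {}).get(fs, 0.0)
                 for var, fs in rule["if"].items())
        degrees[rule["then"]] = max(degrees.get(rule["then"], 0.0), mu)
    return degrees

rules = [{"if": {"temperature": "high", "load": "low"}, "then": "reduce_power"}]
index = compile_rules(rules)                                  # constant part
facts = {"temperature": {"high": 0.8}, "load": {"low": 0.6}}  # varies per cycle
print(match(index, facts))                                    # {'reduce_power': 0.6}
```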

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out through cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was to train students attending Architecture and Engineering courses in geospatial data acquisition and processing, in order to start up a team of "volunteer mappers". The project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in the activities connected with geospatial data collection, integration and sharing. The area proposed for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997; the area was affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, and topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.

    An overview of decision table literature 1982-1995.

    This report gives an overview of the literature on decision tables over the past 15 years. As much as possible, for each reference an author-supplied abstract, a number of keywords and a classification are provided. In some cases our own comments are added; the purpose of these comments is to show where, how and why decision tables are used. The literature is classified according to application area, theoretical versus practical character, year of publication, country of origin (not necessarily the country of publication) and the language of the document. After a description of the scope of the survey, the classification results and the classification by topic are presented. The main body of the paper is the ordered list of publications with abstracts, classifications and comments.
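
    For readers unfamiliar with the construct the surveyed literature deals with, the sketch below shows one common way a decision table can be represented and evaluated in code: condition entries per rule (with don't-cares) selecting actions. The order-handling conditions and actions are hypothetical and are not taken from the report.

```python
# A tiny decision table, included only to illustrate the construct the
# surveyed literature is about; the conditions and actions are hypothetical.
# Each rule pairs condition entries (True/False/None = don't care) with the
# actions it selects.
CONDITIONS = ("customer is preferred", "order exceeds credit limit")
RULES = [
    {"if": (True,  False), "do": ["approve order"]},
    {"if": (True,  True),  "do": ["refer to manager"]},
    {"if": (False, None),  "do": ["refer to manager"]},
]

def decide(facts):
    """Return the actions of the first rule whose entries match the facts."""
    for rule in RULES:
        if all(entry is None or entry == fact
               for entry, fact in zip(rule["if"], facts)):
            return rule["do"]
    return []

print(decide((True, False)))   # ['approve order']
print(decide((False, True)))   # ['refer to manager']
```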

    CBR and MBR techniques: review for an application in the emergencies domain

    The purpose of this document is to provide an in-depth analysis of current reasoning engine practice and of the integration strategies for Case-Based Reasoning (CBR) and Model-Based Reasoning (MBR) that will be used in the design and development of the RIMSAT system. RIMSAT (Remote Intelligent Management Support and Training) is a European Commission funded project designed to: (a) provide an innovative, 'intelligent', knowledge-based solution aimed at improving the quality of critical decisions; and (b) enhance the competencies and responsiveness of individuals and organisations involved in highly complex, safety-critical incidents, irrespective of their location. In other words, RIMSAT aims to design and implement a decision support system that applies Case-Based Reasoning and Model-Based Reasoning technology to the management of emergency situations. This document is part of a deliverable of the RIMSAT project and, although it has been written in close contact with the requirements of the project, it provides an overview wide enough to constitute a state of the art in integration strategies between CBR and MBR technologies.
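
    As a hedged illustration of the case-based side of such an integration, the sketch below implements the retrieve step of CBR as a weighted nearest-neighbour search over past incident cases; the attributes, weights and cases are hypothetical, not RIMSAT data.

```python
# Hedged sketch of the CBR "retrieve" step: weighted nearest-neighbour search
# over past cases. Attributes, weights and example cases are hypothetical.
def similarity(query, case, weights):
    """Weighted similarity over numeric attributes normalised to [0, 1]."""
    total = sum(w * (1.0 - abs(query[a] - case[a])) for a, w in weights.items())
    return total / sum(weights.values())

def retrieve(query, case_base, weights, k=1):
    """Return the k most similar past cases (with their stored solutions)."""
    ranked = sorted(case_base,
                    key=lambda c: similarity(query, c, weights),
                    reverse=True)
    return ranked[:k]

case_base = [
    {"severity": 0.9, "population_density": 0.8, "solution": "full evacuation"},
    {"severity": 0.4, "population_density": 0.3, "solution": "shelter in place"},
]
weights = {"severity": 0.7, "population_density": 0.3}
print(retrieve({"severity": 0.8, "population_density": 0.7}, case_base, weights))
```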

    Knowledge-based processing for aircraft flight control

    This Contractor Report documents research in Intelligent Control using knowledge-based processing in a manner dual to methods found in the classic stochastic decision, estimation, and control discipline. Such knowledge-based control has also been called Declarative and Hybrid control. Software architectures were sought that employ the parallelism inherent in modern object-oriented modeling and programming. The viewpoint adopted was that Intelligent Control employs a class of domain-specific software architectures having features common to a broad variety of implementations, such as management of aircraft flight, power distribution, etc. As much attention was paid to software engineering issues as to artificial intelligence and control issues. This research considered particular processing methods from the stochastic and knowledge-based worlds to be duals, that is, similar in a broad context. They provide architectural design concepts that serve as bridges between the disparate disciplines of decision, estimation, control, and artificial intelligence. The research was applied to the control of a subsonic transport aircraft in the airport terminal area.
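
    The sketch below gives one hedged reading of such a dual, hybrid structure: a declarative, knowledge-based supervisor selects the operating mode, and a conventional numeric feedback law executes it. The modes, thresholds and gains are hypothetical and are not taken from the contractor report.

```python
# Hedged sketch of a hybrid "knowledge-based over numeric" control structure:
# a declarative supervisor picks a mode, a classical feedback law acts on it.
# Modes, thresholds and gains are hypothetical, not from the report.
def supervisor(altitude_error_ft, airspeed_kt):
    """Declarative mode selection (knowledge-based layer)."""
    if airspeed_kt < 140:
        return "protect"       # low speed: prioritise airspeed protection
    if abs(altitude_error_ft) > 300:
        return "capture"       # large error: aggressive altitude capture
    return "track"             # nominal altitude tracking

PITCH_GAIN = {"capture": 0.08, "track": 0.03, "protect": 0.0}

def pitch_command(altitude_error_ft, airspeed_kt):
    """Numeric control law (classical layer), gated by the selected mode."""
    mode = supervisor(altitude_error_ft, airspeed_kt)
    return mode, PITCH_GAIN[mode] * altitude_error_ft

print(pitch_command(altitude_error_ft=450, airspeed_kt=180))  # ('capture', 36.0)
print(pitch_command(altitude_error_ft=50,  airspeed_kt=180))  # ('track', 1.5)
```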