16,517 research outputs found
The VEX-93 environment as a hybrid tool for developing knowledge systems with different problem solving techniques
The paper describes VEX-93, a hybrid environment for developing
knowledge-based and problem-solver systems. It integrates methods and
techniques from artificial intelligence, image and signal processing, and
data analysis, which can be freely combined. Reasoning is organized in two
hierarchical levels: an intelligent toolbox with one upper strategic
inference engine and four lower engines implementing specific reasoning
models: truth-functional (rule-based), probabilistic (causal networks),
fuzzy (rule-based) and case-based (frames). Image/signal processing and
analysis capabilities are provided in the form of programming languages
with more than one hundred primitive functions.
User-made programs can be embedded within knowledge bases, allowing the
combination of perception and reasoning. The data analyzer toolbox contains
a collection of numerical classification, pattern recognition and ordination
methods, together with neural network tools and a database query language
at the inference engines' disposal.
VEX-93 is an open system able to communicate with external computer programs
relevant to a particular application. Metaknowledge can be used to
elaborate conclusions, and man-machine interaction includes, besides windows
and graphical interfaces, acceptance of voice commands and production of
speech output.
The system was conceived for real-world applications in general domains;
as a concrete example, a medical diagnostic support system currently under
completion as a Cuban-Spanish project is mentioned.
The present version of VEX-93 is a large system composed of about one and a
half million lines of C code, and it runs on microcomputers under Windows 3.1.
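The truth-functional (rule-based) reasoning level mentioned above can be illustrated with a minimal forward-chaining sketch. The rules and facts here are hypothetical examples, not VEX-93's actual knowledge representation, and the engine is deliberately simplified.

```python
# Minimal forward-chaining rule engine: repeatedly fire any rule whose
# premises are all satisfied, until no new facts can be derived.
# Facts are strings; a rule is (list_of_premises, conclusion).

def forward_chain(facts, rules):
    """Derive the closure of `facts` under `rules`."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical diagnostic-style rules for illustration only.
rules = [
    (["fever", "cough"], "flu_suspected"),
    (["flu_suspected", "high_risk"], "refer_to_doctor"),
]
derived = forward_chain(["fever", "cough", "high_risk"], rules)
print(derived)  # includes "flu_suspected" and "refer_to_doctor"
```

In a hybrid architecture like the one described, a strategic engine would decide when to invoke such a rule-based engine versus the probabilistic, fuzzy, or case-based ones.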
Proceedings of the 4th field robot event 2006, Stuttgart/Hohenheim, Germany, 23-24th June 2006
A very extensive report on the 4th Field Robot Event, held on 23 and 24 June 2006 in Stuttgart/Hohenheim.
BlogForever D5.2: Implementation of Case Studies
This document presents the internal and external testing results for the BlogForever case studies. The evaluation of the BlogForever implementation process is tabulated under the most relevant themes and aspects identified during testing. The case studies provide relevant feedback on the sustainability of the platform in terms of potential users' needs, together with relevant information on its possible long-term impact.
Big data analytics: Computational intelligence techniques and application areas
Big Data has a significant impact on developing functional smart cities and supporting modern societies. In this paper, we investigate the importance of Big Data in modern life and economy, and discuss challenges arising from Big Data utilization. Different computational intelligence techniques are considered as tools for Big Data analytics. We also explore the powerful combination of Big Data and Computational Intelligence (CI) and identify a number of areas where novel applications to real-world smart city problems can be developed by utilizing these powerful tools and techniques. We present a case study on intelligent transportation in the context of a smart city, and a novel data modelling methodology based on a biologically inspired universal generative modelling approach called the Hierarchical Spatial-Temporal State Machine (HSTSM). We further discuss various implications of policy, protection, valuation and commercialization related to Big Data, its applications and its deployment.
Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)
The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". Indeed, the project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in the activities connected with geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial Lidar, close-range and aerial photogrammetry, topographic and GNSS instruments, etc., or by non-conventional systems and instruments such as UAVs, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.
On the fate of Lorentz symmetry in loop quantum gravity and noncommutative spacetimes
I analyze the deformation of Lorentz symmetry that holds in certain
noncommutative spacetimes and the way in which Lorentz symmetry is broken in
other noncommutative spacetimes. I also observe that discretization of areas
does not necessarily require departures from Lorentz symmetry. This is due to
the fact that Lorentz symmetry has no implications for exclusive measurement of
the area of a surface, but it governs the combined measurements of the area and
the velocity of a surface. In a quantum-gravity theory Lorentz symmetry can be
consistent with area discretization, but only when the observables "area of
the surface" and "velocity of the surface" enjoy certain special properties. I
argue that the status of Lorentz symmetry in the loop-quantum-gravity approach
requires careful scrutiny, since areas are discretized within a formalism that,
at least presently, does not include an observable "velocity of the surface".
In general it may prove to be very difficult to reconcile Lorentz symmetry with
area discretization in theories of canonical quantization of gravity, because a
proper description of Lorentz symmetry appears to require that the
fundamental/primary role be played by the surface's world-sheet, whose
"projection" along the space directions of a given observer describes the
observable area, whereas the canonical formalism only allows the introduction
as primary entities of observables defined at a fixed (common) time, and the
observers that can be considered must share that time variable.
Comment: 59 pages, LaTeX
A Comprehensive Review on Multimedia Retrieval Techniques
Abstract: With the prevalence of multimedia technologies and web media, users can no longer be satisfied by customary information retrieval techniques. Because of this, content-based image retrieval is becoming a new and fast method of information retrieval. Content-based image retrieval is the technique of retrieving data, particularly images, from a wide collection of databases. The retrieval is carried out by using features. Content-Based Image Retrieval (CBIR) is a technique to organize a wide variety of images by their visual features. Feature-based retrieval procedures are available for retrieving images, and in this review we investigate them. In the first section, we address some basics of a typical CBIR framework and describe some fundamental features of an image, such as shape, texture and color, along with different techniques to compute them. We also describe different distance measures used for similarity estimation between images, and discuss indexing methods. Finally, the conclusion and future scope are examined.
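The color-feature comparison step common to many CBIR systems can be sketched as follows: each image is summarized as a normalized color histogram, and two images are compared with histogram intersection. This is an illustrative toy example, not the method of any specific system; real CBIR pipelines combine color with shape and texture features and use indexing structures for scale.

```python
# Toy CBIR color feature: quantize RGB pixels into bins**3 buckets,
# normalize to a histogram, and compare histograms by intersection.

def color_histogram(pixels, bins=4):
    """Return a normalized color histogram over bins**3 RGB buckets."""
    hist = [0.0] * (bins ** 3)
    for r, g, b in pixels:
        idx = ((r * bins // 256) * bins * bins
               + (g * bins // 256) * bins
               + (b * bins // 256))
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]

def intersection(h1, h2):
    """Histogram intersection similarity in [0, 1]; 1 means identical."""
    return sum(min(a, b) for a, b in zip(h1, h2))

img_a = [(255, 0, 0)] * 10                       # all red
img_b = [(255, 0, 0)] * 5 + [(0, 0, 255)] * 5    # half red, half blue
print(intersection(color_histogram(img_a), color_histogram(img_b)))  # 0.5
```

In a retrieval setting, the query image's histogram would be compared against precomputed histograms for every database image, with the highest-scoring images returned first.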
DOI: 10.17762/ijritcc2321-8169.15061