
    Belief Scheduler based on model failure detection in the TBM framework. Application to human activity recognition.

    A tool called the Belief Scheduler is proposed for state sequence recognition in the Transferable Belief Model (TBM) framework. The tool smooths noisy temporal belief functions using a Temporal Evidential Filter (TEF). The Belief Scheduler smooths the belief on states, separates the states (assumed to be true or false), and synchronizes them in order to infer the sequence. A criterion is also provided to assess how well the observed belief functions fit a given sequence model. This criterion is based on the conflict information that appears explicitly in the TBM when combining observed belief functions with predictions. The Belief Scheduler is part of a generic architecture developed for on-line, automatic human action and activity recognition in videos of athletics taken with a moving camera. In experiments, the system is assessed on a database of 69 real athletics video sequences. The goal is to automatically recognize running, jumping, falling, and standing-up actions, as well as high jump, pole vault, triple jump, and long jump activities of an athlete. A comparison with Hidden Markov Models for video classification is also provided.
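    The conflict criterion can be made concrete with a minimal sketch of the unnormalized conjunctive rule used in the TBM, where the conflict is the mass that ends up on the empty set after combining an observation with a prediction. The single true/false frame and the mass values below are invented for illustration and are not taken from the paper.

```python
from itertools import product

# Frame of discernment for one state: the state is True (T) or False (F).
# A mass function maps subsets of the frame (frozensets) to belief masses.

def conjunctive_combination(m1, m2):
    """Unnormalized conjunctive rule of the TBM: mass may flow to the empty set."""
    combined = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        combined[inter] = combined.get(inter, 0.0) + wa * wb
    return combined

def conflict(m1, m2):
    """Conflict = mass assigned to the empty set after conjunctive combination."""
    return conjunctive_combination(m1, m2).get(frozenset(), 0.0)

T, F = frozenset({"T"}), frozenset({"F"})
TF = frozenset({"T", "F"})

observed  = {T: 0.7, F: 0.1, TF: 0.2}   # smoothed observed belief on a state
predicted = {T: 0.2, F: 0.6, TF: 0.2}   # prediction from the sequence model

print(conflict(observed, predicted))     # 0.44: high conflict hints at model failure
```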

    Evidence theory for face tracking

    This paper deals with real-time face detection and tracking by a video camera, addressed here from the standpoint of evidential fusion. The method relies on a simple and fast supervised initialization stage for learning. The Transferable Belief Model is used to cope with the incompleteness of the prior face model due to the lack of exhaustiveness of the learning stage. The algorithm works in two steps. The detection phase synthesizes an evidential face model by merging basic beliefs derived from the Viola and Jones face detector with colour mass functions modelling a skin-tone detector; these functions are computed from information sources in a logarithmic colour space. To handle the dependence of the colour sources in the fusion process, we propose a compromise operator inspired by Denœux's cautious rule. In the tracking phase, the pignistic probabilities derived from the face model guarantee compatibility between the belief-function and probabilistic frameworks. They are the inputs of a classical particle filter which ensures face tracking at video rate. We analyse the influence of the parameters of the evidential model on tracking quality.
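    The pignistic transform that links the credal and probabilistic levels can be sketched as follows; the face/non-face mass values are invented for illustration and are not taken from the paper.

```python
def pignistic_transform(m):
    """Map a mass function (subsets as frozensets) to pignistic probabilities BetP."""
    empty = m.get(frozenset(), 0.0)
    betp = {}
    for subset, mass in m.items():
        if not subset:
            continue
        share = mass / (len(subset) * (1.0 - empty))   # split mass evenly over elements
        for element in subset:
            betp[element] = betp.get(element, 0.0) + share
    return betp

# Hypothetical face/non-face mass function produced by the fused detectors.
m = {frozenset({"face"}): 0.6,
     frozenset({"non-face"}): 0.1,
     frozenset({"face", "non-face"}): 0.3}

print(pignistic_transform(m))   # {'face': 0.75, 'non-face': 0.25}
```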

    Virtual Reality Games for Motor Rehabilitation

    This paper presents a fuzzy-logic-based method to track user satisfaction without the need for devices that monitor users' physiological conditions. User satisfaction is key to any product's acceptance, and computer applications and video games provide a unique opportunity to tailor the environment to each user's needs. We have implemented a non-adaptive fuzzy logic model of emotion, based on the emotional component of the Fuzzy Logic Adaptive Model of Emotion (FLAME) proposed by El-Nasr, to estimate player emotion in UnrealTournament 2004. In this paper we describe the implementation of this system and present the results of one of several play tests. Our research contradicts the current literature suggesting that physiological measurements are needed: we show that it is possible to estimate user emotion with a software-only method.
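    The flavour of such a fuzzy appraisal can be shown with a toy rule; the variables, membership parameters and the single rule below are illustrative inventions, not the FLAME rule base used in the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_frustration(health_lost, progress):
    """Toy appraisal over inputs in [0, 1]: 'if health loss is high AND progress
    is low then frustration is high'. Names, parameters and the rule are invented."""
    high_loss    = tri(health_lost, 0.4, 1.0, 1.6)   # peaks at 1.0
    low_progress = tri(progress, -0.6, 0.0, 0.6)     # peaks at 0.0
    return min(high_loss, low_progress)              # Mamdani-style AND as min

print(estimate_frustration(health_lost=0.8, progress=0.1))   # ~0.67
```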

    Topology control and data handling in wireless sensor networks

    Our work in this thesis has provided two distinct contributions to WSNs, in the areas of data handling and topology control. In the area of data handling, we have demonstrated a solution that improves power efficiency while preserving the important data features, through data compression and an adaptive sampling strategy, both applicable to the oceanographic monitoring application required by the SECOAS project. Our analysis of the oceanographic data is important for understanding the data we are dealing with, so that suitable strategies can be deployed and system performance can be analysed. The Basic Adaptive Sampling Scheduler (BASS) algorithm uses the statistics of the data to adjust the sampling behaviour of a sensor node according to the environment, in order to conserve energy and minimise detection delay. The motivation of topology control (TC) is to maintain the connectivity of the network, to reduce node degree so as to ease congestion in a collision-based medium access scheme, and to reduce power consumption in the sensor nodes. We have developed Subgraph Topology Control (STC), a distributed algorithm that does not require additional equipment on the SECOAS nodes. STC uses a metric called the subgraph number, which measures the 2-hop connectivity in the neighbourhood of a node. STC consistently forms topologies with lower node degrees and higher probabilities of connectivity than k-Neighbours, an alternative algorithm that does not rely on special hardware on the sensor nodes. Moreover, STC also gives better results in terms of the minimum degree in the network, which implies that the network structure is more robust to a single point of failure. As STC is an iterative algorithm, it is very scalable and adaptive, and is well suited to the SECOAS applications.
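    One plausible reading of the BASS idea, sampling faster when the recent statistics indicate change and slower when the signal is quiet, can be sketched as follows; the choice of statistic (standard deviation) and all constants are illustrative and are not the values used in the thesis.

```python
from statistics import pstdev

def next_sampling_interval(recent_samples, base_interval=60.0,
                           min_interval=5.0, max_interval=300.0, k=50.0):
    """Toy adaptive sampler in the spirit of BASS: sample faster when the signal
    is volatile, slower when it is quiet. Statistic and constants are invented."""
    if len(recent_samples) < 2:
        return base_interval
    volatility = pstdev(recent_samples)               # spread of the recent window
    interval = base_interval / (1.0 + k * volatility)
    return max(min_interval, min(max_interval, interval))

# Quiet readings -> long interval; rapidly changing readings -> short interval.
print(next_sampling_interval([1.01, 1.02, 1.01, 1.02]))   # ~48 s
print(next_sampling_interval([0.8, 1.4, 0.6, 1.6]))       # clamped to 5 s
```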

    Knowledge Acquisition Analytical Games: games for cognitive systems design

    Knowledge discovery from data and knowledge acquisition from experts are steps of paramount importance when designing cognitive systems. The literature discusses extensively the issues related to current knowledge acquisition techniques. In this doctoral work we explore the use of gaming approaches as knowledge acquisition tools, capitalising on aspects such as engagement, ease of use and the ability to access tacit knowledge. More specifically, we explore the use of analytical games for this purpose. Analytical games for decision making are not a new class of games, but rather a set of platform-independent simulation games, designed not for entertainment, whose main purpose is research on decision making, either over its complete dynamic cycle or a portion of it (i.e. situational awareness). Moreover, the work focuses on the use of analytical games as knowledge acquisition tools. To this end, the Knowledge Acquisition Analytical Game (K2AG) method is introduced. K2AG is an innovative game framework for supporting the knowledge acquisition task. The framework introduced in this doctoral work was born as a generalisation of the Reliability Game, which in turn was inspired by the Risk Game. More specifically, K2AGs aim at collecting information and knowledge to be used in the design of cognitive systems and their algorithms. The two main aspects that characterise these games are the use of knowledge cards to render information and meta-information to the players, and the use of an innovative data gathering method that exploits geometrical features of simple shapes (e.g. a triangle) to easily collect players' beliefs. These beliefs can be mapped to subjective probabilities or to masses (in the evidence theory framework) and used for algorithm design purposes. However, K2AGs may also use different means of conveying information to the players and of collecting data. Part of the work has been devoted to a detailed articulation of the design cycle of K2AGs. More specifically, van der Zee's simulation gaming design framework has been extended to account for the different kinds of models that characterise the design of simulation games and simulations in general, namely a conceptual model (platform independent), a design model (platform independent) and one or more implementation models (platform dependent). In addition, the processes that lead from one model to the other have been mapped to the design phases of analytical wargaming. Aspects of game validation and player experience evaluation have also been addressed: based on the literature, a set of validation criteria for K2AGs has been proposed and a player experience questionnaire for K2AGs has been developed. This questionnaire extends work proposed in the literature, but a validation has not been possible at the time of writing. Finally, two instantiations of the K2AG framework, namely the Reliability Game and the MARISA Game, have been designed and analysed in detail to validate the approach and show its potential.
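    The geometric elicitation idea, reading a player's belief off the position of a token inside a simple shape, can be illustrated with barycentric coordinates in a triangle; the vertex labels and the exact mapping below are an illustrative guess, not necessarily the mapping defined for K2AGs.

```python
def barycentric_masses(p, a, b, c):
    """Map a token position p inside triangle (a, b, c) to three weights summing
    to 1 via barycentric coordinates. Assigning vertices to hypotheses (e.g.
    'reliable', 'not reliable', 'don't know') is a design choice; this mapping is
    an illustrative guess, not necessarily the one used in the K2AG framework."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w_a = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    w_b = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return w_a, w_b, 1.0 - w_a - w_b

# A token placed near vertex a puts most of the weight on a's hypothesis.
print(barycentric_masses(p=(0.8, 0.1), a=(1.0, 0.0), b=(0.0, 0.0), c=(0.5, 1.0)))
# ≈ (0.75, 0.15, 0.10)
```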

    Artificial Intelligence and Ambient Intelligence

    This book includes a series of scientific papers published in the Special Issue on Artificial Intelligence and Ambient Intelligence of the MDPI journal Electronics. The book starts with an opinion paper, “Relations between Electronics, Artificial Intelligence and Information Society through Information Society Rules”, which presents the relations between the information society, electronics and artificial intelligence mainly through twenty-four IS laws. The book then continues with a series of technical papers presenting applications of Artificial Intelligence and Ambient Intelligence in a variety of fields, including affective computing, privacy and security in smart environments, and robotics. More specifically, the first part presents the use of Artificial Intelligence (AI) methods in combination with wearable devices (e.g., smartphones and wristbands) for recognizing human psychological states (e.g., emotions and cognitive load). The second part presents the use of AI methods in combination with laser sensors or Wi-Fi signals to improve security in smart buildings by identifying and counting visitors. The last part presents the use of AI methods in robotics to improve robots' abilities in object gripping, manipulation and perception. The language of the book is rather technical, so the intended audience is scientists and researchers with at least some basic knowledge of computer science.

    Evidential Networks for the fusion of heterogeneous multimodal data (application to fall detection)

    This work took place within the development of a remote home healthcare monitoring application designed to detect distress situations through several types of sensors. Multi-sensor fusion can provide more accurate and reliable information than the information provided by each sensor taken separately. Furthermore, the data from the heterogeneous sensors in such monitoring systems carry different degrees of imperfection and trust. Among multi-sensor fusion techniques, belief-function methods based on Dempster-Shafer theory are currently considered the most appropriate for representing and processing imperfect information, thus allowing a more realistic modeling of the problem. Based on a graphical representation of Dempster-Shafer theory called Evidential Networks, a structure for fusing heterogeneous data from multiple sensors is proposed for fall detection, in order to maximize the performance of automatic fall detection and thus make the system more reliable. The non-stationarity of the signals collected by the sensors of the considered system may degrade the experimental conditions and make the Evidential Networks inconsistent in their decisions. To compensate for the effects of this non-stationarity, the networks are made to evolve over time, which leads to the introduction of Dynamic Evidential Networks; these are evaluated on simulated fall scenarios corresponding to various use cases.
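    A standard belief-function tool for coping with a source whose quality degrades is discounting, which transfers part of each mass to total ignorance. The sketch below uses classical Shafer discounting with invented sensor names and numbers; it is not the dynamic evidential network machinery developed in the thesis.

```python
def discount(m, reliability, frame):
    """Shafer discounting: scale each mass by the source's reliability and move
    the remaining mass to total ignorance (the whole frame)."""
    discounted = {subset: reliability * mass for subset, mass in m.items()}
    omega = frozenset(frame)
    discounted[omega] = discounted.get(omega, 0.0) + (1.0 - reliability)
    return discounted

frame = {"fall", "no_fall"}
# Hypothetical accelerometer evidence; names and numbers are illustrative only.
accelerometer = {frozenset({"fall"}): 0.8, frozenset(frame): 0.2}

# A drifting, noisier sensor gets a lower reliability, pushing mass to ignorance.
print(discount(accelerometer, reliability=0.6, frame=frame))
# {frozenset({'fall'}): 0.48, frozenset({'fall', 'no_fall'}): 0.52}
```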

    First Annual Workshop on Space Operations Automation and Robotics (SOAR 87)

    Several topics related to automation and robotics technology are discussed. Automation of checkout, ground support, and logistics; automated software development; man-machine interfaces; neural networks; systems engineering and distributed/parallel processing architectures; and artificial intelligence/expert systems are among the topics covered.

    Advanced power system protection and incipient fault detection and protection of spaceborne power systems

    This research concentrated on the application of advanced signal processing, expert system, and digital technologies to the detection and control of low-grade, incipient faults on spaceborne power systems. The researchers have considerable experience in the application of advanced digital technologies and in the protection of terrestrial power systems, and this experience was used in the current contracts to develop new approaches for protecting the electrical distribution system in spaceborne applications. The project was divided into three distinct areas: (1) investigate the applicability of fault detection algorithms developed for terrestrial power systems to the detection of faults in spaceborne systems; (2) investigate the digital hardware and architectures required to monitor and control spaceborne power systems, with full capability to implement new detection and diagnostic algorithms; and (3) develop a real-time expert operating system for implementing diagnostic and protection algorithms. Significant progress has been made in each of these areas: several terrestrial fault detection algorithms were modified to better adapt to spaceborne power system environments, and several digital architectures were developed and evaluated in light of the fault detection algorithms.