
    A Particle Swarm Optimization inspired tracker applied to visual tracking

    Get PDF
    Visual tracking is a dynamic optimization problem in which time and the object state jointly shape the search space. In this paper, we show that a tracker can be built from an evolutionary optimization approach, the Particle Swarm Optimization (PSO) algorithm. We demonstrate that an extension of the original algorithm in which the system dynamics is explicitly taken into account can perform efficient tracking. This tracker is also shown to outperform the SIR (Sampling Importance Resampling) algorithm with both random-walk and constant-velocity models, as well as a previously proposed PSO-inspired tracker, SPSO (Sequential Particle Swarm Optimization). Experiments were performed on both simulated data and real visual RGB-D data. Our PSO-inspired tracker can be a very effective and robust alternative for visual tracking.
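
    The abstract does not reproduce the update equations, but the canonical PSO step it builds on can be sketched briefly. The following minimal Python sketch shows one swarm iteration over candidate object states for a single frame; the state layout, coefficient values and the score_fn appearance model are illustrative assumptions, not the authors' tracker.

    import numpy as np

    def pso_step(x, v, pbest, pbest_score, gbest, score_fn,
                 w=0.7, c1=1.5, c2=1.5, rng=None):
        """One PSO iteration over candidate object states for the current frame.

        x, v, pbest : (N, D) arrays of particle states, velocities and personal bests
                      (e.g. D = 3 for x, y, scale).
        pbest_score : (N,) best appearance score seen so far by each particle.
        gbest       : (D,) best state found so far in this frame.
        score_fn    : appearance-matching score, higher is better (placeholder).
        """
        rng = rng or np.random.default_rng()
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # Inertia + cognitive pull toward each particle's best + social pull toward the swarm best.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        s = np.array([score_fn(p) for p in x])
        better = s > pbest_score
        pbest[better], pbest_score[better] = x[better], s[better]
        gbest = pbest[np.argmax(pbest_score)]
        return x, v, pbest, pbest_score, gbest

    Between frames, the particles would additionally be propagated with an explicit motion model (for example constant velocity) before the swarm is re-run, which is the kind of system-dynamics extension the abstract refers to.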

    A study of human-robot interaction with two groups of elderly people

    Get PDF
    We studied human-robot interaction with a PR2 robot operating autonomously in a living-lab setting to provide an object-search service to elderly volunteers, either familiar with robots or naïve. Observation was complemented by semi-directed interviews. There was no significant difference between the two groups in either the successful detection of the willingness to interact or the appreciation of the voice interaction. This result argues for dedicated HCI development that takes the specificities of elderly users into account.

    Perceiving user's intention-for-interaction: A probabilistic multimodal data fusion scheme

    Get PDF
    Understanding people's intention, whether expressed in action or in thought, plays a fundamental role in establishing coherent communication. This is especially true in non-proactive robotics, where the robot has to determine explicitly, and in a natural way, when to start an interaction. In this work, a novel approach to detecting people's intention-for-interaction is presented. The proposed detector fuses multimodal cues, including estimated head pose, shoulder orientation and vocal activity detection, using a probabilistic discrete-state Hidden Markov Model. The multimodal detector achieves correct detection rates of up to 80%, improving on the purely audio-based and RGB-D-based variants.
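
    To make the fusion scheme concrete, the following minimal Python sketch runs a discrete-state HMM forward filter over a binary intention state, driven by binarised head-pose, shoulder-orientation and voice-activity cues. The transition and emission values, and the naive-Bayes treatment of the cues, are illustrative assumptions rather than the parameters used in the paper.

    import numpy as np

    # Hidden state: 0 = no intention, 1 = intention-for-interaction.
    A = np.array([[0.9, 0.1],                  # transition matrix (illustrative values)
                  [0.2, 0.8]])

    # Per-cue emission tables P(cue value | state); each cue is binarised upstream,
    # e.g. head facing the robot, shoulders facing the robot, voice activity detected.
    EMIT = {
        "head":     np.array([[0.7, 0.3], [0.2, 0.8]]),
        "shoulder": np.array([[0.6, 0.4], [0.3, 0.7]]),
        "voice":    np.array([[0.8, 0.2], [0.4, 0.6]]),
    }

    def forward_step(belief, obs):
        """One HMM forward-filter step with naive-Bayes fusion of the multimodal cues."""
        predicted = belief @ A                   # propagate the belief through the transitions
        likelihood = np.ones(2)
        for cue, value in obs.items():           # fuse cues, assumed independent given the state
            likelihood *= EMIT[cue][:, value]
        posterior = predicted * likelihood
        return posterior / posterior.sum()

    belief = np.array([0.5, 0.5])
    for obs in [{"head": 1, "shoulder": 1, "voice": 0},
                {"head": 1, "shoulder": 1, "voice": 1}]:
        belief = forward_step(belief, obs)
        print(f"P(intends to interact) = {belief[1]:.2f}")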

    A Multi-modal Perception based Architecture for a Non-intrusive Domestic Assistant Robot

    Get PDF
    We present a multi-modal perception-based architecture for a non-intrusive domestic assistant robot. The robot is non-intrusive in that it starts an interaction only when it automatically detects the user's intention to interact. All of the robot's actions rely on multi-modal perception: user detection from RGB-D data, detection of the user's intention-for-interaction from RGB-D and audio data, and communication via speech recognition. Using multi-modal cues in the different parts of the robot's activity paves the way for successful runs.
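
    As a rough illustration of how such perception modules could be chained, here is a hypothetical structural sketch of the control loop in Python; the module names and interfaces are assumptions made for illustration, not the project's actual software.

    class NonIntrusiveAssistant:
        """Perception-driven loop: the robot interacts only when intention is detected."""

        def __init__(self, user_detector, intention_detector, speech_interface):
            self.user_detector = user_detector            # RGB-D based person detection
            self.intention_detector = intention_detector  # RGB-D + audio intention-for-interaction
            self.speech = speech_interface                # speech recognition / synthesis

        def step(self, rgbd_frame, audio_frame):
            user = self.user_detector.detect(rgbd_frame)
            if user is None:
                return                                    # nobody detected: stay idle
            if not self.intention_detector.detect(user, rgbd_frame, audio_frame):
                return                                    # user present but not addressing the robot
            request = self.speech.listen()                # the interaction starts only at this point
            if request:
                self.speech.say("Understood: " + request)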

    Segregation in desiccated sessile drops of biological fluids

    Full text link
    It is shown here that the competition between advection and diffusion in a drying sessile drop of a biological fluid can produce a spatial redistribution of albumen and salt. The result explains the patterns observed in dried drops of biological fluids.
    Comment: 6 pages, 3 figures; submitted to the European Physical Journal
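
    As a toy illustration of the advection/diffusion competition mentioned above (not the paper's axisymmetric drop model), the following one-dimensional, flux-form finite-difference sketch shows solute piling up near the edge while diffusion smears it back; all parameter values are arbitrary.

    import numpy as np

    # Toy 1D competition between outward advection and diffusion in flux form:
    #   dc/dt = -d/dx ( u*c - D*dc/dx ),  with no-flux walls at both ends.
    # The outward flow piles solute up against the right wall (the "contact line"),
    # diffusion smears it back, and their balance sets the final profile.
    N, L = 200, 1.0
    dx = L / N
    D, u = 5e-4, 0.02            # diffusivity and outward velocity (arbitrary units)
    dt = 0.25 * dx**2 / D        # conservative explicit time step

    c = np.ones(N)               # initially uniform concentration
    for _ in range(4000):
        # Fluxes on interior cell faces: centred advection plus Fickian diffusion.
        flux = u * 0.5 * (c[:-1] + c[1:]) - D * (c[1:] - c[:-1]) / dx
        flux = np.concatenate(([0.0], flux, [0.0]))   # closed (no-flux) walls
        c -= dt * np.diff(flux) / dx

    print("edge/centre concentration ratio:", c[-1] / c[N // 2])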

    Which objects do frail elderly people misplace at home? A pilot study of 60 people

    Get PDF
    Losing objects is a source of conflict between frail elderly people and their caregivers. To our knowledge, the literature on the delusion of theft does not identify which objects are involved. In the RIDDLE project, we use a companion robot to help elderly people find the objects they are looking for. We therefore conducted a study based on separate, cross interviews of 60 patient/caregiver dyads to identify which objects would be the most relevant to locate. Objects are searched for by the patient according to 72% of the patients and 82% of the caregivers. The most commonly searched-for objects, when in use by the patient, are: spectacles (45%), house keys (34%), mobile phone (31%), wallet (26%), remote control (19%) and cane (22%). Once the localization technology has been fitted to these objects, the resulting assistance service will have to be adapted to each user's habits.

    Evaporation induced flow inside circular wells

    Full text link
    The flow field and the height-averaged radial velocity inside a droplet evaporating in an open circular well were calculated for different modes of liquid evaporation.
    Comment: 5 pages, 3 figures; submitted to the European Physical Journal
