
    A Wearable Sensor Network for Gait Analysis: A 6-Day Experiment of Running Through the Desert

    This paper presents a new system for the analysis of walking and running gaits. The system is based on a network of wireless nodes with various types of embedded sensors. It was designed to allow long-term recording in outdoor environments and was tested during the 2010 "Sultan Marathon des Sables" desert race. A runner was fitted with the sensor network for six days of the competition. Although technical problems limited the amount of data recorded, the experiment was nevertheless successful: the system did not interfere with the runner, who finished with a high ranking; the concept was validated; and high-quality data were acquired. The loss of some of the measurements was mainly due to problems with the cable connectors between the nodes and the batteries. In this paper, we describe the technical aspects of the system developed and the experimental conditions under which it was validated, and give examples of the data obtained, with some preliminary processing.

    Occupancy grids from stereo and optical flow data

    see basilic: http://emotion.inrialpes.fr/bibemotion/2006/BPUCL06/ address: Rio de Janeiro (BR)
    In this paper, we propose a real-time method to detect obstacles using theoretical models of the ground plane, first in a 3D point cloud given by a stereo camera, and then in an optical flow field given by one camera of the stereo pair. The idea of our method is to combine two partial occupancy grids from the two sensor modalities within an occupancy-grid framework. The two methods do not have the same range, precision, and resolution. For example, the stereo method is precise for close objects but cannot see further than 7 m (with our lenses), while the optical flow method can see considerably further but has lower accuracy. Experiments carried out on the CyCab mobile robot and on a tractor demonstrate that we can combine the advantages of both algorithms to build local occupancy grids from incomplete data (optical flow from a monocular camera cannot give depth information without time integration).
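The abstract does not spell out the fusion rule used to combine the two partial grids. A minimal sketch of the standard independent-sensor log-odds combination common in occupancy-grid frameworks, with hypothetical function names and toy numbers, illustrates how each modality can fill in cells the other leaves unknown:

```python
import numpy as np

def log_odds(p):
    """Convert an occupancy probability to log-odds."""
    return np.log(p / (1.0 - p))

def fuse_grids(p_a, p_b):
    """Fuse two partial occupancy grids cell by cell, assuming
    conditionally independent sensors (log-odds sum).
    Cells at 0.5 ("unknown") contribute nothing, so each sensor
    fills in the regions where the other is blind."""
    l = log_odds(p_a) + log_odds(p_b)
    return 1.0 / (1.0 + np.exp(-l))

# Toy 1x3 grid: stereo is confident only near, optical flow only far.
stereo = np.array([0.9, 0.5, 0.5])
flow   = np.array([0.5, 0.5, 0.8])
fused = fuse_grids(stereo, flow)   # ~[0.9, 0.5, 0.8]
```

When both sensors agree that a cell is occupied, the log-odds sum makes the fused estimate more confident than either input alone.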

    An Autonomous Car-Like Robot Navigating Safely Among Pedestrians

    see basilic: http://emotion.inrialpes.fr/bibemotion/2004/PHKBBL04/ address: New Orleans, LA (US)
    The recent development of a new kind of public transportation system relies on a particular double-steering kinematic structure enhancing maneuverability in cluttered environments such as downtown areas. We call a vehicle with this kind of kinematics a bi-steerable car. Endowed with autonomy capacities, the bi-steerable car ought to combine suitably and safely a set of abilities: simultaneous localization and environment modeling, motion planning, and motion execution amidst moderately dynamic obstacles. In this paper we address the integration of these four essential autonomy abilities into a single application. Specifically, we aim at reactive execution of planned motion. We address the fusion of the controls issued from the control law and from the obstacle avoidance module using probabilistic techniques.

    Safe and Autonomous Navigation for a Car-Like Robot Among Pedestrians

    see basilic: http://emotion.inrialpes.fr/bibemotion/2003/PHKBBL03/ address: Madrid (ES)
    The recent development of a new kind of public transportation system relies on a particular double-steering kinematic structure enhancing maneuverability in cluttered environments such as downtown areas. We call a vehicle with this kind of kinematics a bi-steerable car. Endowed with autonomy capacities, the bi-steerable car ought to combine suitably and safely a set of abilities: simultaneous localization and environment modeling, motion planning, and motion execution amidst dynamic obstacles. In this paper we address the integration of these four essential autonomy abilities into a single application. Specifically, we aim at reactive execution of planned motion. We address the fusion of the controls issued from the control law and from the obstacle avoidance module using probabilistic techniques.
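The abstract does not detail how the control-law and obstacle-avoidance commands are fused probabilistically. One simple illustrative scheme (not necessarily the authors' method; all names are hypothetical) treats each steering command as a Gaussian and takes the precision-weighted mean, so the more confident module dominates:

```python
def fuse_controls(u_plan, var_plan, u_avoid, var_avoid):
    """Blend the command from the planned-motion control law with the
    command from the obstacle avoidance module, treating each as a
    Gaussian and taking the precision-weighted mean (Gaussian product).
    The lower-variance (more confident) command dominates the result."""
    w_plan, w_avoid = 1.0 / var_plan, 1.0 / var_avoid
    u = (w_plan * u_plan + w_avoid * u_avoid) / (w_plan + w_avoid)
    var = 1.0 / (w_plan + w_avoid)  # fused uncertainty shrinks
    return u, var

# Planned steering 0.2 rad (uncertain) vs. avoidance -0.4 rad (confident):
u, var = fuse_controls(0.2, 1.0, -0.4, 0.25)   # u ~ -0.28, var = 0.2
```

With these toy numbers the avoidance command carries four times the weight of the planned one, pulling the fused steering strongly toward the evasive maneuver.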

    Use of Optical Flow for Obstacle Detection and Navigation

    see basilic: http://emotion.inrialpes.fr/bibemotion/2004/Bra04/ school: INPG address: Grenoble (FR)

    Real-time Time-To-Collision from variation of Intrinsic Scale

    Time-to-collision can be directly measured from a spatio-temporal image sequence obtained from an uncalibrated camera. It would thus appear to offer a simple, elegant measurement for use in obstacle avoidance. However, previous techniques for computing time to collision from an optical flow field have proven impractical for real applications. This paper presents a new approach for computing time to collision (TTC) based on the idea of measuring the rate of change of the "intrinsic scale". Intrinsic scale is a geometric invariant that is valid at most points in an image and can be rapidly determined using a multi-resolution pyramid. In this paper we develop the approach and demonstrate its feasibility by comparing the results with range measurements obtained from a laser ranging device on a moving vehicle. Experimental results show that this is a simple method for obtaining reliable TTC at low computational cost.
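The abstract does not give the paper's exact formulation, but the classical looming relation behind such scale-based approaches is TTC = s / (ds/dt): the time to contact equals the current scale divided by its growth rate. A hedged sketch with an illustrative two-frame estimate (function name and numbers are hypothetical):

```python
def time_to_collision(scale_prev, scale_curr, dt):
    """Estimate time to collision from the growth of a scale
    measurement between two frames: TTC = s / (ds/dt).
    A shrinking or constant scale means the object is not
    approaching, so TTC is reported as infinite."""
    ds_dt = (scale_curr - scale_prev) / dt
    if ds_dt <= 0.0:
        return float("inf")
    return scale_curr / ds_dt

# Scale grows from 10 to 11 (arbitrary units) over a 0.1 s frame interval:
ttc = time_to_collision(10.0, 11.0, 0.1)   # ~1.1 s
```

Note that no camera calibration or metric depth is needed: only the relative growth rate of the scale measurement enters the estimate, which is what makes this family of methods attractive for cheap obstacle avoidance.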

    Real-time stereo and optical flow data fusion

    see basilic: http://emotion.inrialpes.fr/bibemotion/2006/BUPCL06a/ address: Beijing (CN)
