    Ambulatory position and orientation tracking fusing magnetic and inertial sensing

    This paper presents the design and testing of a portable magnetic system combined with miniature inertial sensors for ambulatory 6-degrees-of-freedom (DOF) human motion tracking. The magnetic system consists of three orthogonal coils (the source), fixed to the body, and 3-D magnetic sensors, fixed to remote body segments, which measure the fields generated by the source. From the measured signals, a processor calculates the relative positions and orientations between source and sensor. Magnetic actuation requires a substantial amount of energy, which limits the update rate achievable with a set of batteries. Moreover, the magnetic field is easily disturbed by ferromagnetic materials or other sources. Inertial sensors can be sampled at high rates, require little energy, and do not suffer from magnetic interference. However, accelerometers and gyroscopes can only measure changes in position and orientation and suffer from integration drift. By combining measurements from both systems in a complementary Kalman filter structure, an optimal solution for position and orientation estimates is obtained. The magnetic system provides 6-DOF measurements at a relatively low update rate, while the inertial sensors track the changes in position and orientation between the magnetic updates. The implemented system was tested against a lab-bound camera tracking system for several functional body movements. The accuracy was about 5 mm for position and 3 degrees for orientation measurements. Errors were higher during movements with high velocities due to relative movement between source and sensor within one cycle of magnetic actuation.
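The complementary fusion described above can be illustrated in one dimension: high-rate inertial integration drifts, and low-rate absolute "magnetic" position updates correct it. This is a hedged toy sketch, not the paper's 6-DOF implementation; the noise parameters `q` and `r` and the function names are assumptions.

```python
import numpy as np

# Minimal 1-D complementary-Kalman sketch (illustrative only):
# inertial prediction runs every sample, an absolute position
# measurement arrives every `mag_every` samples and corrects drift.

def fuse(accel, mag_pos, dt, mag_every, q=1e-3, r=1e-4):
    """accel: high-rate acceleration samples; mag_pos: absolute
    positions available every `mag_every` samples; returns the
    fused position estimate at every sample."""
    pos, vel = 0.0, 0.0
    p = 1.0                      # variance of the position estimate
    out = []
    for k, a in enumerate(accel):
        vel += a * dt            # inertial prediction (drifts)
        pos += vel * dt
        p += q                   # uncertainty grows between updates
        if k % mag_every == 0:   # low-rate absolute update
            kgain = p / (p + r)
            pos += kgain * (mag_pos[k // mag_every] - pos)
            p *= (1.0 - kgain)
        out.append(pos)
    return out
```

With a stationary target at 5.0 and zero measured acceleration, the estimate converges to the magnetic measurement within a few updates.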

    Multisensor Data Fusion for Human Activities Classification and Fall Detection

    Significant research exists on the use of wearable sensors in the context of assisted living for activity recognition and fall detection, whereas radar sensors have been studied only recently in this domain. This paper addresses the performance limitations of individual sensors, especially for classifying similar activities, by fusing features extracted from experimental data collected by different sensors, namely a tri-axial accelerometer, a micro-Doppler radar, and a depth camera. Preliminary results confirm that combining information from heterogeneous sensors improves the overall performance of the system. The classification accuracy attained with this fusion approach improves by 11.2% compared to radar-only use, and by 16.9% compared to the accelerometer. Furthermore, adding features extracted from an RGB-D Kinect sensor increases the overall classification accuracy up to 91.3%.
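Feature-level fusion of heterogeneous sensors can be sketched as concatenating per-sensor feature vectors before classification. This is a hedged illustration, not the paper's pipeline: the nearest-centroid classifier below is a stand-in for whatever classifier the authors actually trained.

```python
import numpy as np

# Feature-level fusion sketch: each sensor contributes a feature
# vector; vectors are concatenated and one classifier is trained
# on the fused representation.

def fuse_features(*per_sensor_features):
    """Concatenate per-sensor feature vectors into one fused vector."""
    return np.concatenate(per_sensor_features)

def fit_centroids(X, y):
    """Nearest-centroid 'classifier': one mean vector per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    """Assign x to the class with the closest centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
```

The point of the sketch is only the shape of the data flow: activities that look identical to one sensor can become separable once another sensor's features are appended.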

    Interactive and cooperative sensing and control for advanced teleoperation

    This paper presents the paradigm of interactive and cooperative sensing and control as a fundamental mechanism for integrating and fusing the strengths of man and machine in advanced teleoperation. Interactive and cooperative sensing and control is considered an extended and generalized form of traded and shared control. Its emphasis is placed on the distribution of mutually nonexclusive subtasks to man and machine, the interactive invocation of subtasks under the man/machine symbiotic relationship, and the fusion of information and decision-making between man and machine according to their confidence measures. The proposed system is composed of major functional blocks: the logical sensor system, sensor-based local autonomy, virtual environment formation, and cooperative decision-making between man and machine. The Sensing-Knowledge-Command (SKC) fusion network is proposed as a fundamental architecture for implementing cooperative and interactive sensing and control. Simulation results are shown.
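Fusing decisions "according to their confidence measures" might be sketched as a confidence-weighted blend of the two commands. This is an assumption about one plausible reading of that phrase; the paper's SKC fusion network is far more elaborate.

```python
# Hedged sketch of confidence-weighted decision fusion between a
# human operator and a machine controller (not the paper's SKC
# network): each party supplies a scalar command and a confidence,
# and the fused command is the confidence-normalized average.

def fuse_decisions(human_cmd, human_conf, machine_cmd, machine_conf):
    """Blend two scalar commands by normalized confidence weights."""
    total = human_conf + machine_conf
    return (human_conf * human_cmd + machine_conf * machine_cmd) / total
```

When the human is three times as confident as the machine, the fused command sits three quarters of the way toward the human's input.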

    Environmental Sensing by Wearable Device for Indoor Activity and Location Estimation

    We present results from a set of experiments in this pilot study to investigate the causal influence of user activity on various environmental parameters monitored by occupant-carried multi-purpose sensors. Hypotheses with respect to each type of measurement are verified, including temperature, humidity, and light level collected during eight typical activities: sitting in lab / cubicle, indoor walking / running, resting after physical activity, climbing stairs, taking elevators, and outdoor walking. Our main contribution is the development of features for activity and location recognition based on environmental measurements, which exploit location- and activity-specific characteristics and capture the trends resulting from the underlying physiological process. The features are statistically shown to have good separability and are also information-rich. Fusing environmental sensing together with acceleration is shown to achieve classification accuracy as high as 99.13%. For building applications, this study motivates a sensor fusion paradigm for learning individualized activity, location, and environmental preferences for energy management and user comfort.
    Comment: submitted to the 40th Annual Conference of the IEEE Industrial Electronics Society (IECON
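Trend-capturing features over environmental signals can be sketched as per-window statistics; the feature choice below (window mean and linear slope) is an assumption for illustration, not the paper's actual feature set.

```python
import numpy as np

# Illustrative per-window features over an environmental signal
# (e.g. temperature): the mean captures location-specific levels,
# the slope captures activity-driven trends such as warming while
# climbing stairs. Feature names/choices are assumptions.

def window_features(signal, fs, win_s=10.0):
    """Split a 1-D signal into non-overlapping windows of `win_s`
    seconds (at sample rate `fs` Hz) and return (mean, slope)
    features per window."""
    n = int(win_s * fs)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = np.asarray(signal[start:start + n], dtype=float)
        t = np.arange(n) / fs
        slope = np.polyfit(t, w, 1)[0]   # linear trend per window
        feats.append((w.mean(), slope))
    return feats
```

Such per-sensor feature vectors could then be concatenated with acceleration features for the fused classification the abstract reports.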