2,155 research outputs found

    A wearable solution for accurate step detection based on the direct measurement of the inter-foot distance

    Accurate step detection is crucial for the estimation of gait spatio-temporal parameters. Although several step-detection methods based on inertial measurement units (IMUs) have been successfully proposed, they may not perform adequately when the foot is dragged while walking, when walking aids are used, or when walking at low speed. The aim of this study was to test an original step-detection method, the inter-foot distance step counter (IFOD), based on the direct measurement of the distance between the feet. Gait data were recorded using a wearable prototype system (SWING2DS), which integrates an IMU and two time-of-flight distance sensors (DSs). The system was attached to the medial side of the right foot, with one DS positioned close to the forefoot (FOREDS) and the other close to the rearfoot (REARDS). Sixteen healthy adults were asked to walk over ground for two minutes along a loop, including both rectilinear and curvilinear portions, during two experimental sessions. The accuracy of the IFOD step counter was assessed using a stereo-photogrammetric system as the gold standard. The best performance was obtained for REARDS, with an accuracy higher than 99.8% for the instrumented-foot steps and 88.8% for the non-instrumented-foot steps during both rectilinear and curvilinear walks. Key features of the IFOD step counter are that both right and left steps can be detected by instrumenting one foot only, and that it does not rely on foot-impact dynamics. The IFOD step counter can be combined with existing IMU-based methods to increase step-detection accuracy.
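    As a rough illustration of the idea behind a distance-based step counter (this is not the authors' IFOD algorithm; the function name and threshold are illustrative assumptions), steps can be detected as local maxima of the inter-foot distance signal:

    ```python
    def count_steps_from_distance(distance_signal, threshold):
        """Count steps as local maxima of an inter-foot distance signal.

        A sample is counted when it exceeds `threshold` and is a local peak
        relative to its immediate neighbours. Real systems would add
        smoothing and a refractory period between detections.
        """
        steps = 0
        for i in range(1, len(distance_signal) - 1):
            d_prev, d, d_next = distance_signal[i - 1], distance_signal[i], distance_signal[i + 1]
            if d > threshold and d > d_prev and d >= d_next:
                steps += 1
        return steps
    ```

    On a synthetic triangular signal with three peaks, the sketch counts three steps; a real signal would be noisier and require filtering first.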

    An Indoor Navigation System Using a Sensor Fusion Scheme on Android Platform

    With the development of wireless communication networks, smartphones have become a necessity in people’s daily lives: they meet not only users’ basic needs, such as sending a message or making a phone call, but also their demands for entertainment, browsing the Internet and socializing. Navigation functions are commonly used, but they are typically based on GPS (Global Positioning System) in outdoor environments, whereas a number of applications need to navigate indoors. This paper presents a system that achieves highly accurate indoor navigation on the Android platform. To do this, we design a sensor fusion scheme for our system. We divide the system into three main modules: a distance measurement module, an orientation detection module and a position update module. In the distance measurement module, we use an efficient way to estimate stride length and a step sensor to count steps. In the orientation detection module, to obtain the optimal orientation estimate, we introduce a Kalman filter to de-noise the data collected from the different sensors. In the last module, we combine the data from the previous modules and calculate the current location. Experimental results show that our system works well and has high accuracy in indoor situations.
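    A minimal sketch of the kind of Kalman filtering used for orientation de-noising (a scalar heading filter, not the paper's actual design; the class name and noise values are illustrative assumptions) fuses an integrated gyroscope rate with an absolute heading measurement such as a compass reading:

    ```python
    class HeadingKalman:
        """Scalar Kalman filter: gyro integration as prediction, compass as update."""

        def __init__(self, q, r, heading=0.0, p=1.0):
            self.q = q          # process noise variance (gyro drift)
            self.r = r          # measurement noise variance (compass jitter)
            self.x = heading    # heading estimate, degrees
            self.p = p          # estimate variance

        def predict(self, gyro_rate, dt):
            # integrate the gyro rate; uncertainty grows by the process noise
            self.x += gyro_rate * dt
            self.p += self.q

        def update(self, compass_heading):
            # blend in the compass reading with the Kalman gain k
            k = self.p / (self.p + self.r)
            self.x += k * (compass_heading - self.x)
            self.p *= (1.0 - k)
            return self.x
    ```

    With a stationary device (zero gyro rate) and a steady compass reading, the estimate converges to the compass heading while the gain settles at a small steady-state value.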

    System based on inertial sensors for behavioral monitoring of wildlife

    A sensor network integrates multiple sensors into a system that collects information about different environment variables. Monitoring systems allow us to determine the current state of a subject, to understand its behavior and sometimes to predict what is going to happen. This work presents a monitoring system for semi-wild animals that captures their actions using an IMU (inertial measurement unit) and a sensor fusion algorithm. Based on an ARM Cortex-M4 microcontroller, the system sends data from the different sensor axes over ZigBee in two operating modes: RAW (logging all information to an SD card) or RT (real-time operation). The sensor fusion algorithm improves precision and reduces noise interference.
    Junta de Andalucía P12-TIC-130
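    The abstract does not specify which fusion algorithm is used; a common lightweight choice on microcontroller-class hardware is a complementary filter, sketched here purely as an illustration (the function and blending factor are assumptions, not the paper's method):

    ```python
    def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
        """Blend gyro integration (smooth, drifts) with accelerometer tilt (noisy, drift-free).

        `alpha` close to 1 trusts the gyro short-term while the accelerometer
        slowly corrects the long-term drift.
        """
        return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
    ```

    The single multiply-accumulate per axis is why this filter fits comfortably on a Cortex-M4 alongside radio and logging duties.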

    Wearable Inertial Measurement Units for Assessing Gait in Real-World Environments

    Walking patterns can provide important indications of a person’s health status and be beneficial in the early diagnosis of individuals with a potential walking disorder. For appropriate gait analysis, it is critical that natural functional walking characteristics are captured, rather than those exhibited in artificial or observed settings. To better understand the extent to which setting influences gait patterns, and particularly whether observation plays a varying role for subjects of different ages, the current study investigates to what extent people walk differently

    Towards Human Motion Tracking Enhanced by Semi-Continuous Ultrasonic Time-of-Flight Measurements

    Human motion analysis is a valuable tool for assessing disease progression in persons with conditions such as multiple sclerosis or Parkinson’s disease. Human motion tracking is also used extensively for sporting technique and performance analysis, as well as for work-life ergonomics evaluations. Wearable inertial sensors (e.g., accelerometers, gyroscopes and/or magnetometers) are frequently employed because they are easy to mount and can be used in real-life, out-of-the-lab settings, as opposed to video-based lab setups. These distributed sensors cannot, however, measure relative distances between sensors, and are also cumbersome when it comes to calibration and drift compensation. In this study, we tested an ultrasonic time-of-flight sensor for measuring relative limb-to-limb distance, and we developed a combined inertial sensor and ultrasonic time-of-flight wearable measurement system. The aim was to investigate whether ultrasonic time-of-flight sensors can supplement inertial sensor-based motion tracking by providing relative distances between inertial sensor modules. We found that the ultrasonic time-of-flight measurements reflected expected walking motion patterns. The stride length estimates derived from ultrasonic time-of-flight measurements corresponded well with estimates from validated inertial sensors, indicating that the inclusion of ultrasonic time-of-flight measurements could be a feasible approach for improving inertial sensor-only systems. Our prototype was able to record both inertial and time-of-flight measurements simultaneously and continuously, but more work is necessary to merge the complementary approaches to provide more accurate and more detailed human motion tracking.
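    The basic arithmetic behind ultrasonic time-of-flight ranging is straightforward; a sketch under assumed conventions (round-trip echo timing at room-temperature speed of sound; the stride proxy below is an illustrative assumption, not the authors' estimator):

    ```python
    SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C (temperature-dependent)

    def tof_to_range(round_trip_s):
        """Convert a round-trip echo time to a one-way distance in metres."""
        return SPEED_OF_SOUND * round_trip_s / 2.0

    def stride_length_proxy(ranges):
        """Crude proxy: peak limb-to-limb separation observed over one stride.

        A real estimator would segment the gait cycle first and combine the
        distance signal with inertial orientation data.
        """
        return max(ranges)
    ```

    For example, a 2 ms round trip corresponds to 0.343 m of separation. Temperature compensation of the speed of sound matters at centimetre accuracy.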

    Keyframe-based visual–inertial odometry using nonlinear optimization

    Combining visual and inertial measurements has become popular in mobile robotics, since the two sensing modalities offer complementary characteristics that make them the ideal choice for accurate visual–inertial odometry or simultaneous localization and mapping (SLAM). While historically the problem has been addressed with filtering, advancements in visual estimation suggest that nonlinear optimization offers superior accuracy while remaining tractable in complexity, thanks to the sparsity of the underlying problem. Taking inspiration from these findings, we formulate a rigorously probabilistic cost function that combines reprojection errors of landmarks and inertial terms. The problem is kept tractable, ensuring real-time operation, by limiting the optimization to a bounded window of keyframes through marginalization. Keyframes may be spaced in time by arbitrary intervals, while still being related by linearized inertial terms. We present evaluation results on complementary datasets recorded with our custom-built stereo visual–inertial hardware, which accurately synchronizes accelerometer and gyroscope measurements with imagery. A comparison of both a stereo and a monocular version of our algorithm, with and without online extrinsics estimation, is shown with respect to ground truth. Furthermore, we compare the performance to an implementation of a state-of-the-art stochastic cloning sliding-window filter. This competitive reference implementation performs tightly coupled filtering-based visual–inertial odometry. While our approach declaredly demands more computation, we show its superior performance in terms of accuracy.
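    The overall shape of such a combined cost is a weighted sum of squared visual and inertial residuals; a toy sketch (the real formulation weights each residual by its inverse covariance and operates on manifold-valued states, which this deliberately omits):

    ```python
    def vio_cost(reproj_residuals, inertial_residuals, w_visual=1.0, w_inertial=1.0):
        """Toy combined cost: weighted sum of squared reprojection and inertial errors.

        `reproj_residuals` are pixel errors of landmark observations;
        `inertial_residuals` are errors of the linearized inertial terms
        linking consecutive keyframes. Weights stand in for the inverse
        measurement covariances of the full probabilistic formulation.
        """
        visual = sum(r * r for r in reproj_residuals)
        inertial = sum(e * e for e in inertial_residuals)
        return w_visual * visual + w_inertial * inertial
    ```

    A nonlinear least-squares solver then minimizes this over the bounded keyframe window, with marginalization folding older states into a prior term of the same quadratic form.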

    Development of a Standalone Pedestrian Navigation System Utilizing Sensor Fusion Strategies

    Pedestrian inertial navigation systems yield the foundational information required for many possible indoor navigation and positioning services and applications, but current systems have difficulty providing accurate location information due to system instability. By adding a low-cost ultrasonic ranging device to a foot-mounted inertial navigation system, the system gains the ability to detect surrounding obstacles, such as walls. Using these detected walls as a basis of correction, an intuitive algorithm that can be added to already established systems was developed, demonstrably reducing final location errors. After a 160 m walk, the final location error was reduced from 8.9 m (5.5% of the total distance walked) to 0.53 m. Furthermore, during a 400 m walk the peak error was reduced from 10.3 m to 1.43 m. With long-term system accuracy and stability being largely dependent on the ability of gyroscopes to accurately estimate changes in yaw angle, the proposed system helps correct these inaccuracies, making it a strong candidate for implementation in obstacle-rich environments such as those found indoors.
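    A common way to exploit detected walls for yaw correction is to snap the estimated heading to the nearest dominant building direction when it is close enough; this sketch illustrates that idea only (the function, the assumed rectilinear wall directions, and the tolerance are assumptions, not the paper's algorithm):

    ```python
    def snap_heading_to_wall(heading_deg, wall_angles=(0.0, 90.0, 180.0, 270.0), tol=15.0):
        """Snap a heading to the nearest dominant wall direction if within `tol` degrees.

        Indoor buildings are often rectilinear, so gyroscope yaw drift can be
        corrected by aligning the heading with a detected wall direction.
        Returns the heading unchanged when no wall direction is close enough.
        """
        for wall in wall_angles:
            # signed angular difference wrapped to (-180, 180]
            diff = (heading_deg - wall + 180.0) % 360.0 - 180.0
            if abs(diff) <= tol:
                return wall
        return heading_deg
    ```

    The wrap-around difference ensures that, e.g., 355 degrees snaps to 0 rather than being treated as far from every wall direction.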

    Wearable inertial sensor system towards daily human kinematic gait analysis: benchmarking analysis to MVN BIOMECH

    This paper presents a cost- and time-effective wearable inertial sensor system, the InertialLAB. It includes gyroscopes and accelerometers for the real-time monitoring of the 3D angular velocity and 3D acceleration of up to six lower-limb and trunk segments, and the sagittal joint angle of up to six joints. InertialLAB follows an open architecture with a low computational load, executable by wearable processing units at up to 200 Hz, for providing kinematic gait data to third-party systems, advancing on similar commercial systems. For joint angle estimation, we developed a trigonometric method based on the segment orientations previously computed by fusion-based methods. The validation covered healthy gait patterns at varying speeds and on varying terrain (flat, ramp, and stairs), including turns, extending the experiments reported in the literature. The benchmarking analysis against MVN BIOMECH showed that InertialLAB provides more reliable measures on stairs than on flat terrain and ramps. The joint angle time-series of InertialLAB showed good waveform similarity (>0.898) with MVN BIOMECH, resulting in high reliability and excellent validity. User-independent neural network regression models successfully minimized the drift errors observed in InertialLAB’s joint angles (NRMSE < 0.092). Further, users ranked InertialLAB as good in terms of usability. InertialLAB shows promise for daily kinematic gait analysis and real-time kinematic feedback for wearable third-party systems.
    This work has been supported in part by the Fundação para a Ciência e Tecnologia (FCT) with the Reference Scholarship under Grant SFRH/BD/108309/2015 and SFRH/BD/147878/2019, by the FEDER Funds through the Programa Operacional Regional do Norte and national funds from FCT with the project SmartOs under Grant NORTE-01-0145-FEDER-030386, and through the COMPETE 2020—Programa Operacional Competitividade e Internacionalização (POCI)—with the Reference Project under Grant POCI-01-0145-FEDER-006941
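    The abstract's trigonometric joint-angle method computes joint angles from adjacent segment orientations; in the sagittal plane this reduces to a wrapped angular difference, sketched here as an illustration (the sign convention and function name are assumptions, not the paper's exact formulation):

    ```python
    def sagittal_joint_angle(proximal_deg, distal_deg):
        """Joint angle as the difference of adjacent segment orientations.

        Each segment orientation (e.g., thigh and shank for the knee) is
        assumed to come from an upstream sensor-fusion stage. The result is
        wrapped to (-180, 180] degrees to avoid discontinuities.
        """
        diff = (proximal_deg - distal_deg) % 360.0
        return diff - 360.0 if diff > 180.0 else diff
    ```

    For a thigh at 30 degrees and a shank at -10 degrees, this yields a 40-degree knee angle; the wrap keeps estimates continuous when segment orientations cross the ±180-degree boundary.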

    Cooperative localization by dual foot-mounted inertial sensors and inter-agent ranging

    The implementation challenges of cooperative localization by dual foot-mounted inertial sensors and inter-agent ranging are discussed, and work on the subject is reviewed. System architecture and sensor fusion are identified as key challenges. A partially decentralized system architecture based on step-wise inertial navigation and step-wise dead reckoning is presented. This architecture is argued to reduce the computational cost and required communication bandwidth by around two orders of magnitude, while giving only negligible information loss in comparison with a naive centralized implementation. This makes joint global state estimation feasible for up to a platoon-sized group of agents. Furthermore, robust and low-cost sensor fusion for the considered setup, based on state space transformation and marginalization, is presented. The transformation and marginalization are used to give the necessary flexibility for the presented sampling-based updates for the inter-agent ranging and the ranging-free fusion of the two feet of an individual agent. Finally, characteristics of the suggested implementation are demonstrated with simulations and a real-time system implementation.
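    Step-wise dead reckoning, as used in the architecture above, propagates position one step at a time from per-step displacements and heading changes; a minimal planar sketch (the interface is an assumption chosen for illustration, not the paper's state representation):

    ```python
    import math

    def dead_reckon(start_xy, steps, start_heading=0.0):
        """Propagate a 2D position through a sequence of steps.

        `steps` is a list of (step_length_m, heading_change_rad) pairs, the
        kind of compressed per-step summary that step-wise dead reckoning
        exchanges instead of raw inertial data, which is what cuts the
        communication bandwidth. Returns the final (x, y) position.
        """
        x, y = start_xy
        heading = start_heading
        for length, dpsi in steps:
            heading += dpsi
            x += length * math.cos(heading)
            y += length * math.sin(heading)
        return x, y
    ```

    Two 1 m steps with a quarter-turn between them move the agent from the origin to roughly (1, 1), illustrating how a handful of scalars per step replaces hundreds of raw IMU samples.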