4,708 research outputs found

    Gait analysis in a box: A system based on magnetometer-free IMUs or clusters of optical markers with automatic event detection

    Gait analysis based on full-body motion capture (MoCap) technology can be used in rehabilitation to aid decision making during treatments or therapies. To promote the use of MoCap gait analysis based on inertial measurement units (IMUs) or optical technology, certain limitations must be overcome, such as the need for magnetically controlled environments, which affects IMU systems, or the need for additional instrumentation to detect gait events, which affects both IMU and optical systems. We present a MoCap gait analysis system called Move Human Sensors (MH), which incorporates proposals to overcome both limitations and can be configured with magnetometer-free IMUs (MH-IMU) or clusters of optical markers (MH-OPT). Using a test–retest reliability experiment with thirty-three healthy subjects (20 men and 13 women, 21.7 ± 2.9 years), we determined the reproducibility of both configurations. The assessment confirmed that the proposals performed adequately and allowed us to establish usage considerations. This study aims to enhance gait analysis in daily clinical practice.
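
    The abstract does not specify the event-detection algorithm used by MH, so the following Python sketch only illustrates the general kind of automatic gait-event detection such a system performs: finding mid-swing peaks and subsequent heel-strike candidates in a shank-mounted gyroscope signal. The signal layout, threshold, and sampling rate are illustrative assumptions, not the MH method.

        # Illustrative sketch only: flag gait-event candidates from a shank gyroscope
        # signal. Threshold and synthetic data are assumptions, not the MH algorithm.
        import numpy as np

        def detect_gait_events(gyro_sagittal, swing_threshold=1.0):
            """Return sample indices of mid-swing peaks and heel-strike candidates.

            gyro_sagittal   : 1-D array of sagittal-plane shank angular velocity (rad/s).
            swing_threshold : minimum peak value (rad/s) counted as mid-swing (assumed).
            """
            g = np.asarray(gyro_sagittal, dtype=float)
            # Mid-swing: local maxima above the threshold.
            peaks = [i for i in range(1, len(g) - 1)
                     if g[i] > swing_threshold and g[i] >= g[i - 1] and g[i] > g[i + 1]]
            # Heel-strike candidate: first local minimum after each mid-swing peak.
            heel_strikes = []
            for p in peaks:
                for i in range(p + 1, len(g) - 1):
                    if g[i] <= g[i - 1] and g[i] < g[i + 1]:
                        heel_strikes.append(i)
                        break
            return np.array(peaks), np.array(heel_strikes)

        # Synthetic example: two 1 s gait cycles sampled at 100 Hz.
        t = np.arange(0, 2.0, 0.01)
        signal = 2.0 * np.sin(2 * np.pi * t)   # crude stand-in for shank angular velocity
        ms, hs = detect_gait_events(signal)
        print("mid-swing samples:", ms, "heel-strike candidates:", hs)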

    Recognition of elementary arm movements using orientation of a tri-axial accelerometer located near the wrist

    In this paper we present a method for recognising three fundamental movements of the human arm (reach and retrieve, lift cup to mouth, rotation of the arm) by determining the orientation of a tri-axial accelerometer located near the wrist. Our objective is to detect the occurrence of such movements performed with the impaired arm of a stroke patient during normal daily activities, as a means to assess their rehabilitation. The method relies on accurately mapping transitions between predefined, standard orientations of the accelerometer to corresponding elementary arm movements. To evaluate the technique, kinematic data were collected from four healthy subjects and four stroke patients as they performed a number of tasks involved in a representative activity of daily living, 'making a cup of tea'. Our experimental results show that the proposed method can independently recognise all three of the elementary upper limb movements investigated, with accuracies in the range 91–99% for healthy subjects and 70–85% for stroke patients.
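
    The core idea is mapping transitions between predefined accelerometer orientations to elementary arm movements. A minimal Python sketch of that idea follows; the orientation bins, state labels, and transition table are hypothetical placeholders, not the definitions used by the authors.

        # Minimal sketch: map transitions between coarse accelerometer orientations to
        # elementary arm movements. Bins and transition table are hypothetical.
        import numpy as np

        def orientation_state(ax, ay, az):
            """Return a coarse orientation label from one quasi-static sample (in g)."""
            pitch = np.degrees(np.arctan2(-ax, np.hypot(ay, az)))
            roll = np.degrees(np.arctan2(ay, az))
            if pitch > 45:
                return "forearm_up"
            if pitch < -45:
                return "forearm_down"
            return "forearm_level_palm_down" if abs(roll) < 90 else "forearm_level_palm_up"

        # Hypothetical mapping of orientation transitions to elementary movements.
        TRANSITIONS = {
            ("forearm_level_palm_down", "forearm_up"): "lift cup to mouth",
            ("forearm_up", "forearm_level_palm_down"): "reach and retrieve",
            ("forearm_level_palm_down", "forearm_level_palm_up"): "rotation of the arm",
        }

        def recognise(samples):
            """samples: iterable of (ax, ay, az) in g. Returns recognised movements."""
            states = [orientation_state(*s) for s in samples]
            events = []
            for prev, cur in zip(states, states[1:]):
                if prev != cur and (prev, cur) in TRANSITIONS:
                    events.append(TRANSITIONS[(prev, cur)])
            return events

        print(recognise([(0.0, 0.0, 1.0), (-1.0, 0.0, 0.1), (0.0, 0.0, 1.0)]))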

    Human Motion Analysis with Wearable Inertial Sensors

    Get PDF
    High-resolution, quantitative data obtained by a human motion capture system can be used to better understand the causes of many diseases and to design effective treatments. In the daily care of the aging population, two issues are critical: one is to continuously track the motions and position of aging people when they are at home, inside a building, or in an unknown environment; the other is to monitor their health status in real time in free-living environments. Continuous monitoring of human movement in natural living environments can potentially provide more valuable feedback than monitoring in laboratory settings. However, it has been extremely challenging to go beyond the laboratory and obtain accurate measurements of human physical activity in free-living environments. Commercial motion capture systems produce excellent in-studio capture and reconstructions, but offer no comparable solution for acquisition in everyday environments. Therefore, in this dissertation, a wearable human motion analysis system is developed for continuously tracking human motions, monitoring health status, positioning the user, and recording the itinerary. Two systems are developed to pursue these goals: one for tracking human body motions and one for positioning a person. First, an inertial-based human body motion tracking system built around our own inertial measurement unit (IMU) is introduced. By attaching a wearable IMU to each body segment at an arbitrary placement, segment motions can be measured as inertial data, and a human model can be reconstructed in real time from these data by applying efficient twist and exponential-map techniques. Second, to validate the feasibility of the tracking system in a practical application, model-based quantification approaches for resting tremor and lower extremity bradykinesia in Parkinson's disease (PD) are proposed. By estimating all joint angles involved in PD symptoms from the reconstructed human model, joint-angle characteristics together with the corresponding medical ratings are used to train an HMM classifier for quantification. In addition, a pedestrian positioning system is developed for tracking the user's itinerary and position in the global frame. Corresponding tests have been carried out to assess the performance of each system.
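
    The dissertation reconstructs the body model with twist and exponential-map techniques. As a hedged illustration of that machinery only, and not of the author's actual implementation, the sketch below converts an axis-angle rotation into a rotation matrix via Rodrigues' formula and applies it to a limb-segment vector; the segment length and rotation values are arbitrary examples.

        # Illustrative sketch of the exponential map used in twist-based body-model
        # reconstruction: axis-angle rotation -> rotation matrix (Rodrigues' formula),
        # then rotate a limb-segment vector. Values are arbitrary examples.
        import numpy as np

        def skew(w):
            """3x3 skew-symmetric matrix of a 3-vector w."""
            return np.array([[0.0, -w[2], w[1]],
                             [w[2], 0.0, -w[0]],
                             [-w[1], w[0], 0.0]])

        def exp_so3(w, theta):
            """Rotation matrix exp(theta * [w]_x) for a unit axis w (Rodrigues' formula)."""
            K = skew(w)
            return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

        # Rotate a 0.3 m forearm segment 90 degrees about the z-axis (elbow-flexion stand-in).
        axis = np.array([0.0, 0.0, 1.0])
        R = exp_so3(axis, np.pi / 2)
        segment = np.array([0.3, 0.0, 0.0])
        print(np.round(R @ segment, 3))   # -> approximately [0., 0.3, 0.]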

    Overcoming Bandwidth Limitations in Wireless Sensor Networks by Exploitation of Cyclic Signal Patterns: An Event-triggered Learning Approach

    Wireless sensor networks are used in a wide range of applications, many of which require real-time transmission of the measurements. Bandwidth limitations therefore constrain the sampling frequency and the number of sensors. This problem can be addressed by reducing the communication load via data compression and event-based communication approaches. The present paper focuses on the class of applications in which the signals exhibit unknown and potentially time-varying cyclic patterns. We review recently proposed event-triggered learning (ETL) methods that identify and exploit these cyclic patterns, show how these methods can be applied to the nonlinear multivariable dynamics of three-dimensional orientation data, and propose a novel approach that uses Gaussian process models. In contrast to other approaches, all three ETL methods work in real time and assure a small upper bound on the reconstruction error. The proposed methods are compared to several conventional approaches on experimental data from human subjects walking with a wearable inertial sensor network. They are found to reduce the communication load by 60–70%, which implies that two to three times more sensor nodes could be used at the same bandwidth.
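
    As a rough Python sketch of the event-triggered idea, and not of the specific ETL algorithms or Gaussian process models in the paper, a node can predict each new sample of a cyclic signal by repeating the previous cycle and transmit only when the prediction error exceeds a chosen bound, so the receiver reconstructs the signal within that bound. The cycle length, error bound, and synthetic signal are assumptions.

        # Rough sketch of event-triggered communication for a cyclic signal: transmit
        # only when a "repeat the last cycle" prediction misses by more than err_bound.
        # Cycle length, bound and signal are assumptions, not the paper's ETL methods.
        import numpy as np

        def event_triggered_stream(signal, cycle_len, err_bound):
            """Return (transmitted_indices, reconstruction) for a 1-D cyclic signal."""
            recon = np.zeros_like(signal)
            sent = []
            for i, x in enumerate(signal):
                pred = recon[i - cycle_len] if i >= cycle_len else 0.0
                if abs(x - pred) > err_bound:      # event: prediction too poor, transmit
                    recon[i] = x
                    sent.append(i)
                else:                              # no event: receiver keeps the prediction
                    recon[i] = pred
            return sent, recon

        # Synthetic gait-like signal: 1 Hz sine sampled at 50 Hz with small noise.
        fs, cycle_len = 50, 50
        t = np.arange(0, 6, 1 / fs)
        rng = np.random.default_rng(0)
        sig = np.sin(2 * np.pi * t) + 0.01 * rng.standard_normal(t.size)
        sent, recon = event_triggered_stream(sig, cycle_len, err_bound=0.05)
        print(f"transmitted {len(sent)} of {sig.size} samples "
              f"({100 * (1 - len(sent) / sig.size):.0f}% communication saved)")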

    Enhanced tracking system based on micro inertial measurements unit to measure sensorimotor responses in pigeons

    The ability to orientate and navigate is critically important for the survival of all migratory birds and other animals. Progress in understanding the mechanisms underlying these capabilities, in particular the importance of sensitivity to the Earth's magnetic field, has thus far been constrained by the limited number of techniques available for analysing often complex behavioural responses. Methods used to track the movements of animals such as birds have varied depending on the degree of accuracy required. Most conventional approaches involve the use of a camera for recording and then measuring an animal's head movements in response to a variety of external stimuli, such as changes in magnetic fields. However, video tracking analysis (VTA) generally provides only a 2D measurement of head angle, and it can only provide information about movements while the head is in view of the camera. To overcome these limitations, the novel invention reported here utilises a lightweight (<10 g) Inertial Measurement Unit (IMU), positioned on the head of a homing pigeon, containing tri-axial orthogonal accelerometers, gyroscopes, and magnetometers. This highly compact (20.3 × 12.7 × 3 mm) system can be programmed and calibrated to provide measurements of the three rotational angles (roll, pitch and yaw) simultaneously while eliminating drift; that is, the movement of the pigeon's head is determined by detecting and estimating the directions of motion at all angles, even those outside the defined tracking areas. Using an existing VTA approach as a baseline for comparison, it is demonstrated that IMU technology can comprehensively track a pigeon's normal head movements with greater precision and in all three axes.
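
    As a hedged illustration of how roll, pitch, and yaw can be estimated from such a head-mounted IMU, the sketch below implements a generic complementary filter fusing gyroscope, accelerometer, and magnetometer data; it is not the device's own algorithm, and the filter gain, sampling rate, and sensor samples are assumptions.

        # Generic complementary-filter sketch: drift-corrected roll/pitch/yaw from a
        # tri-axial gyroscope, accelerometer and magnetometer. Gain, sampling rate and
        # the synthetic samples are assumptions, not the device's actual algorithm.
        import numpy as np

        def complementary_filter(gyro, accel, mag, dt=0.01, alpha=0.98):
            """gyro: (N,3) rad/s, accel: (N,3) g, mag: (N,3) arbitrary units.
            Returns an (N,3) array of [roll, pitch, yaw] in radians."""
            angles = np.zeros((len(gyro), 3))
            roll = pitch = yaw = 0.0
            for i, (g, a, m) in enumerate(zip(gyro, accel, mag)):
                # Integrate angular rate (responsive, but drifts over time).
                roll += g[0] * dt
                pitch += g[1] * dt
                yaw += g[2] * dt
                # Absolute references: gravity for roll/pitch, magnetic field for yaw
                # (simplified: yaw reference assumes the head is roughly level).
                roll_acc = np.arctan2(a[1], a[2])
                pitch_acc = np.arctan2(-a[0], np.hypot(a[1], a[2]))
                yaw_mag = np.arctan2(-m[1], m[0])
                # Blend to remove gyro drift while keeping short-term responsiveness.
                roll = alpha * roll + (1 - alpha) * roll_acc
                pitch = alpha * pitch + (1 - alpha) * pitch_acc
                yaw = alpha * yaw + (1 - alpha) * yaw_mag
                angles[i] = (roll, pitch, yaw)
            return angles

        # Tiny synthetic example: stationary, level head facing magnetic north.
        N = 200
        gyro = np.zeros((N, 3))
        accel = np.tile([0.0, 0.0, 1.0], (N, 1))
        mag = np.tile([1.0, 0.0, 0.0], (N, 1))
        print(np.degrees(complementary_filter(gyro, accel, mag)[-1]))  # ~[0. 0. 0.]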