
    Medical Information Representation Framework for Mobile Healthcare

    In mobile healthcare, medical information is often expressed in different formats, due to local policies and regulations and the heterogeneity of the applications, systems and the adopted information and communication technology. This chapter describes a framework which enables medical information, in particular clinical vital signs and professional annotations, to be processed, exchanged, stored and managed modularly and flexibly in a mobile, distributed and heterogeneous environment, despite the diversity of the formats used to represent the information. To deal with medical information represented in multiple formats, the authors adopt techniques and constructs similar to the ones used on the Internet; in particular, they are inspired by the constructs used in multimedia e-mail and audio-visual data streaming standards. They additionally distinguish the syntax for data transfer and storage from the syntax for expressing medical domain concepts. In this way, they separate the concerns of what to process, exchange and store from how the information can be encoded or transcoded for transfer over the Internet. The authors use an object-oriented information model to express the domain concepts and their relations, and briefly illustrate how framework tools can be used to encode vital sign data for exchange and storage in a distributed and heterogeneous environment.
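    The separation of domain model from transfer syntax that the abstract describes can be sketched as follows. This is an illustrative example with hypothetical names, not the chapter's actual information model: a vital-sign domain object is defined once, and an encoder turns it into one possible wire format (JSON here) without the format leaking into the model.

```python
# Illustrative sketch (hypothetical class and field names, not the
# chapter's actual model): a domain object for a vital sign, kept
# separate from the syntax used to encode it for transfer or storage.
import json
from dataclasses import dataclass, asdict

@dataclass
class VitalSign:
    # Domain concept: what is measured, independent of any wire format
    name: str        # e.g. "heart_rate"
    value: float
    unit: str        # e.g. "bpm"
    timestamp: str   # ISO 8601 string

def encode_json(v: VitalSign) -> str:
    # One possible transfer syntax; an XML or binary encoder could
    # serialise the same domain object without changing the model above.
    return json.dumps(asdict(v))
```

    A second encoder (for example, to XML) could be added next to `encode_json` while the `VitalSign` model stays unchanged, which is the separation of concerns the framework aims at.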

    A Feasibility Study in Measuring Soft Tissue Artifacts on the Upper Leg Using Inertial and Magnetic Sensors

    Soft-tissue artifacts cause inaccurate estimates of body segment orientations: an inertial sensor (or optical marker) rotates (or displaces) with respect to the bone it is meant to track, due to muscle and skin movement [1]. In this pilot study, 11 inertial and magnetic sensors (MTw, Xsens Technologies) were placed on the rectus femoris, vastus medialis and vastus lateralis (upper leg). One sensor was positioned on the tendon plate behind the quadriceps (iliotibial tract, as used in Xsens MVN [1]) and served as the reference sensor. Walking, active and passive knee extensions, and muscle contractions without flexion/extension were recorded for one subject. The orientation of each sensor with respect to the reference sensor was calculated. During walking, relative orientations of up to 28.6° were measured (22.4±3.6°). During muscle contractions without flexion/extension, the largest relative orientations were measured on the rectus femoris (up to 11.1°) [2]. This pilot showed that ambulatory measurement of the deformation of the upper leg is feasible; however, the measurement technology needs improvement. We have therefore designed a new inertial and magnetic sensor system containing smaller sensors, based on the design of an instrumented glove for the assessment of hand kinematics [3]. This new sensor system will be used to investigate soft-tissue artifacts more accurately; in particular we will focus on in-use estimation and elimination of these artifacts.
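    The "orientation of each sensor with respect to the reference sensor" can be computed from the unit quaternions that inertial sensors typically output. The sketch below (an assumption about the representation; the abstract does not state how orientations were parameterised) forms the relative rotation and reports it as a single angle in degrees.

```python
# Sketch: relative orientation between a sensor and a reference sensor,
# assuming both report orientation as a unit quaternion [w, x, y, z].
import numpy as np

def quat_conj(q):
    # Conjugate (= inverse for a unit quaternion)
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(q1, q2):
    # Hamilton product of two quaternions
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def relative_angle_deg(q_sensor, q_ref):
    # Rotation of the sensor relative to the reference sensor,
    # expressed as a single rotation angle in degrees.
    q_rel = quat_mul(quat_conj(q_ref), q_sensor)
    w = np.clip(abs(q_rel[0]), 0.0, 1.0)
    return np.degrees(2.0 * np.arccos(w))
```

    Taking the absolute value of the scalar part maps the double cover of rotations to the shortest-angle representation, so the result is always in [0°, 180°].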

    Instrumental support in the physical activity community - preliminary results

    Currently, we witness the growth of ICT-mediated solutions for chronic disease management, especially to assist and support patients in lifestyle changes in order to improve their health condition. Being physically active is one of the recommended lifestyle changes for patients with chronic diseases. The challenge within those ICT-mediated solutions for physical activity support is to allow patients to manage their physical activity level (PAL) themselves and to provide them with the needed social support. One of the solutions available is the use of a Virtual Community (VC).

    Eating Event Recognition Using Accelerometer, Gyroscope, Piezoelectric, and Lung Volume Sensors

    In overcoming the worldwide problem of overweight and obesity, automatic dietary monitoring (ADM) is introduced as support in dieting practices. ADM aims to automatically, continuously, and objectively measure dimensions of food intake in a free-living environment. This could simplify the food registration process, thereby overcoming frequent memory, underestimation, and overestimation problems. In this study, an eating event detection sensor system was developed comprising a smartwatch worn on the wrist containing an accelerometer and gyroscope for eating gesture detection, a piezoelectric sensor worn on the jaw for chewing detection, and a respiratory inductance plethysmographic sensor consisting of two belts worn around the chest and abdomen for food swallowing detection. These sensors were combined to determine to what extent a combination of sensors focusing on different steps of the dietary cycle can improve eating event classification results. Six subjects participated in an experiment in a controlled setting consisting of both eating and non-eating events. Features were computed for each sensing measure to train a support vector machine model. This resulted in F1-scores of 0.82 for eating gestures, 0.94 for chewing food, and 0.58 for swallowing food.
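    The classification step described above (per-window features into a support vector machine, evaluated with the F1-score) can be sketched as follows. The feature vectors here are synthetic stand-ins for the accelerometer/gyroscope, piezoelectric and plethysmographic features; this is not the authors' pipeline, only the generic shape of it.

```python
# Minimal sketch of the classification stage: an SVM trained on
# per-window feature vectors, scored with F1. Synthetic data stands
# in for the real eating / non-eating sensor features.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic two-class data: "eating" vs "non-eating" windows,
# six features per window (one per hypothetical sensing measure)
X_eat = rng.normal(1.0, 0.5, size=(100, 6))
X_rest = rng.normal(-1.0, 0.5, size=(100, 6))
X = np.vstack([X_eat, X_rest])
y = np.array([1] * 100 + [0] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
score = f1_score(y_te, clf.predict(X_te))
```

    On real data, one model per dietary step (gestures, chewing, swallowing) would yield the three separate F1-scores reported in the abstract.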

    Klauwgezondheid (Claw health)

    The loss per cow culled because of leg and claw disorders amounts to 500 guilders, given the average productive lifespan of the herd.

    On-body inertial sensor location recognition

    Introduction and past research:
    In previous work we presented an algorithm for automatically identifying the body segment to which an inertial sensor is attached during walking [1]. Using this method, the set-up of inertial motion capture systems becomes easier and attachment errors are avoided. The user can place (wireless) inertial sensors on arbitrary body segments. Then, after walking for a few steps, the segment to which each sensor is attached is identified automatically. To classify the sensors, a decision tree was trained using ranked features extracted from the magnitudes and the x-, y- and z-components of the accelerations, angular velocities and angular accelerations.

    Method:
    A drawback of using rankings and correlation coefficients as features is that information from different sensors needs to be combined. We therefore developed a new method using the same data and the same extracted features as in [1], but without the rankings and the correlation coefficients between different sensors. Instead of a decision tree, we used logistic regression to classify the sensors [2]. Unlike a decision tree, logistic regression yields a probability for each body part on which the sensor can be placed. To develop a method that works for different activities of daily living, we recorded 18 activities of ten healthy subjects using 17 inertial sensors, including walking at different speeds, sit-to-stand transfers, lying down, grasping objects, jumping, walking stairs and cycling. The goal is to predict, from the data of a single sensor, the body segment to which that sensor is attached, for different activities of daily living.

    Results:
    A logistic regression classifier was developed and tested with 10-fold cross-validation using 31 walking trials of ten healthy subjects. In the case of a full-body configuration, 482 of a total of 527 (31 × 17) sensors were correctly classified (91.5%).

    Discussion:
    Using our algorithm it is possible to create an intelligent sensor that can determine its own location on the body. The data from the measurements of the different daily-life activities are currently being analysed. In addition, we will investigate the possibility of simultaneously predicting the on-body location of each sensor and the performed activity.
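    The classification scheme described above (logistic regression over per-sensor features, evaluated with 10-fold cross-validation, with a probability per candidate body segment) can be sketched as follows. The feature clusters are synthetic stand-ins, not the authors' data, and the segment count is reduced for brevity.

```python
# Hedged sketch: predicting which body segment a sensor is on, using
# multinomial logistic regression with 10-fold cross-validation.
# Synthetic per-segment feature clusters stand in for the real features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_segments, trials_per_segment, n_features = 5, 40, 8
# One cluster of feature vectors per body segment
X = np.vstack([
    rng.normal(loc=i, scale=0.4, size=(trials_per_segment, n_features))
    for i in range(n_segments)
])
y = np.repeat(np.arange(n_segments), trials_per_segment)

clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation
accuracy = scores.mean()
```

    After fitting, `clf.predict_proba` returns a probability for each candidate segment, which is exactly the property the abstract contrasts with a decision tree's hard assignments.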

    Gait analysis using ultrasound and inertial sensors

    Introduction and past research:
    Inertial sensors are well suited to orientation estimation, but they cannot directly measure the relative positions of human body segments. In previous work we used ultrasound to estimate distances between body segments [1]. In [2] we presented an easy-to-use system for gait analysis in clinical practice as well as in-home situations. Ultrasound range estimates were fused with data from foot-mounted inertial sensors, using an extended Kalman filter, to estimate the 3D (relative) positions and orientations of the feet.

    Validation:
    From the estimated 3D positions we calculated step lengths and stride widths and compared these with an optical reference system. The mean (± standard deviation) of the absolute differences was 1.7 cm (±1.8 cm) for step lengths and 1.2 cm (±1.2 cm) for stride widths over 54 walking trials of three healthy subjects.

    Clinical application:
    Next, the system presented in [2] was used in the INTERACTION project to measure eight stroke subjects during a 10 m walk test [3]. Step lengths, stride widths, and stance and swing times were compared with the Berg balance scale score. The first results showed a correlation between step lengths and Berg balance scale scores. To draw firm conclusions, more patients and also different activities will be investigated next.

    Future work:
    In future work we will extend the system with inertial sensors on the upper and lower legs and the pelvis, so that a closed kinematic loop can be calculated and joint-angle estimates can be improved compared with systems containing only inertial sensors.
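    Once 3D foot positions are available, the step length and stride width reported in the validation above reduce to simple geometry. The sketch below assumes a frame with x forward and y lateral; the abstract does not state the authors' exact definitions, so this is one plausible convention, not their implementation.

```python
# Sketch (assumed coordinate convention, not the authors' exact
# definitions): step length and stride width from the 3D positions of
# two successive foot contacts, with x forward, y lateral, z up.
import numpy as np

def step_metrics(p_prev, p_next):
    """Return (step_length, stride_width) from two successive foot
    contact positions, in the same units as the inputs."""
    d = np.asarray(p_next, dtype=float) - np.asarray(p_prev, dtype=float)
    step_length = abs(d[0])   # progression along the walking direction
    stride_width = abs(d[1])  # lateral separation between the feet
    return step_length, stride_width
```

    For example, contacts at (0, 0, 0) m and (0.6, 0.1, 0) m give a step length of 0.6 m and a stride width of 0.1 m. Per-trial values like these were averaged and compared against the optical reference.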