Fast T wave detection calibrated by clinical knowledge with annotation of P and T waves
There are limited studies on the automatic detection of T waves in arrhythmic electrocardiogram (ECG) signals, perhaps because no arrhythmia dataset with annotated T waves is available. There is a growing need to develop numerically efficient algorithms that can accommodate the new trend of battery-driven ECG devices. Moreover, long-term recordings need to be analyzed in a reliable and time-efficient manner, thereby improving the diagnostic ability of mobile devices and point-of-care technologies. Here, the T wave annotation of the well-known MIT-BIH Arrhythmia Database is discussed and provided. Moreover, a simple, fast method for detecting T waves is introduced. A typical T wave detection method has been reduced to a basic approach consisting of two moving averages and dynamic thresholds. The dynamic thresholds were calibrated using four clinically known types of sinus node response to atrial premature depolarization (compensation, reset, interpolation, and reentry). T wave peaks are determined and the proposed algorithm is evaluated on two well-known databases, the QT and MIT-BIH Arrhythmia databases. The detector obtained a sensitivity of 97.14% and a positive predictivity of 99.29% over the first lead of the validation databases (221,186 beats in total). We present a simple yet very reliable T wave detection algorithm that can potentially be implemented on mobile battery-driven devices. In contrast to complex methods, it can be easily implemented in a digital filter design. Mohamed Elgendi, Bjoern Eskofier and Derek Abbott
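To make the two-moving-averages idea concrete, a minimal sketch follows. It is not the authors' calibrated implementation: the window lengths (w_peak_s, w_wave_s), the offset beta, and the rectification step are illustrative assumptions, and a practical detector would restrict the search to windows following each detected QRS complex and calibrate the thresholds clinically as described above.

```python
import numpy as np

def moving_average(x, w):
    """Centered moving average over a window of w samples."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def detect_t_peaks(ecg, fs, w_peak_s=0.07, w_wave_s=0.14, beta=0.01):
    """Return candidate T wave peak indices from a single-lead ECG segment.

    Blocks of interest are the samples where a short "peak" moving average
    exceeds a longer "wave" moving average plus a dynamic offset; the maximum
    of each sufficiently long block is taken as a candidate T peak.
    """
    x = np.abs(ecg - np.mean(ecg))                 # rectified, baseline-shifted signal
    ma_peak = moving_average(x, int(w_peak_s * fs))
    ma_wave = moving_average(x, int(w_wave_s * fs))
    threshold = ma_wave + beta * np.mean(x)        # dynamic threshold
    inside_block = ma_peak > threshold

    peaks, start = [], None
    min_block = int(w_peak_s * fs)
    for i, inside in enumerate(inside_block):
        if inside and start is None:
            start = i
        elif not inside and start is not None:
            if i - start >= min_block:             # ignore very short blocks
                peaks.append(start + int(np.argmax(x[start:i])))
            start = None
    return np.array(peaks, dtype=int)
```

Because the whole pipeline is built from moving averages and comparisons, it maps directly onto a simple digital filter structure, which is what makes it attractive for battery-driven devices.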
Achieving Efficient and Realistic Full-Radar Simulations and Automatic Data Annotation by Exploiting Ray Meta Data from a Radar Ray Tracing Simulator
In this work, a novel radar simulation concept for efficiently simulating realistic radar data for range, Doppler, and arbitrary antenna positions is introduced. With this concept, the simulated radar signal can also be automatically annotated by splitting it into multiple parts. Annotations that are almost perfect, including the annotation of exotic effects such as multi-path, can be produced with this approach. Signal parts originating from different parts of an object can be labelled as well. To this end, the computation process used in a Monte Carlo shooting and bouncing rays (SBR) simulator is adapted. By considering the hits of each simulated ray, various meta data can be stored, such as hit position, mesh pointer, object IDs, and more. This collected meta data can then be utilized to predict path-length changes caused by object motion to obtain Doppler information, or to apply specific ray filter rules to obtain radar signals that only fulfil specific conditions, such as multiple bounces or the presence of specific object IDs. Using this approach, perfect, and otherwise almost impossible, annotation schemes can be realized.
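The filtering idea can be illustrated with a short sketch. The RayPath fields, the frequency-domain summation, and the example filter rules below are illustrative assumptions rather than the simulator's actual data structures; the point is only that keeping per-ray meta data lets one rebuild a radar response from an arbitrary subset of ray contributions, which is exactly what yields the automatic annotations.

```python
from dataclasses import dataclass
from typing import Callable, Sequence
import numpy as np

@dataclass
class RayPath:
    """Illustrative meta data kept for one ray path in an SBR simulation."""
    object_ids: tuple       # IDs of all objects hit along the path
    n_bounces: int          # number of reflections along the path
    path_length: float      # total geometric path length [m]
    amplitude: complex      # complex contribution of this ray path

def annotated_response(rays: Sequence[RayPath], freqs: np.ndarray,
                       keep: Callable[[RayPath], bool]) -> np.ndarray:
    """Frequency-domain channel response built only from rays that pass `keep`.

    Different `keep` rules yield automatically annotated signal parts, e.g.
    only multi-bounce (multi-path) energy or only the echo of one object.
    """
    c = 299_792_458.0
    h = np.zeros_like(freqs, dtype=complex)
    for ray in rays:
        if keep(ray):
            tau = ray.path_length / c                       # propagation delay of this path
            h += ray.amplitude * np.exp(-2j * np.pi * freqs * tau)
    return h

# Example filter rules (annotation labels)
only_multipath = lambda r: r.n_bounces > 1
only_object_7 = lambda r: 7 in r.object_ids
```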
Consensus based framework for digital mobility monitoring
Digital mobility assessment using wearable sensor systems has the potential to capture walking performance in a patient's natural environment. It enables monitoring of health status and disease progression and evaluation of interventions in real-world situations. In contrast to laboratory settings, real-world walking occurs in non-conventional environments and under unconstrained and uncontrolled conditions. Despite this general understanding, there is a lack of agreed definitions of what constitutes real-world walking, impeding the comparison and interpretation of the acquired data across systems and studies. The goal of this study was to obtain expert-based consensus on specific aspects of real-world walking and to provide respective definitions in a common terminological framework. An adapted Delphi method was used to obtain agreed definitions related to real-world walking. In an online survey, 162 participants from a panel of academic, clinical and industrial experts with experience in the field of gait analysis were asked for agreement on previously specified definitions. Descriptive statistics were used to evaluate whether consensus (> 75% agreement, as defined a priori) was reached. Of the 162 experts invited to participate, 51 completed all rounds (31.5% response rate). We obtained consensus on all definitions ("Walking" > 90%, "Purposeful" > 75%, "Real-world" > 90%, "Walking bout" > 80%, "Walking speed" > 75%, "Turning" > 90% agreement) after two rounds. The identification of a consented set of real-world walking definitions has important implications for the development of assessment and analysis protocols, as well as for the reporting and comparison of digital mobility outcomes across studies and systems. The definitions will serve as a common framework for implementing digital and mobile technologies for gait assessment and are an important link for the transition from supervised to unsupervised gait assessment.
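The consensus rule itself is straightforward; a minimal sketch of the a-priori > 75% agreement criterion, using a made-up, hypothetical response matrix (one row per expert, one column per definition, 1 = agree, 0 = disagree):

```python
import numpy as np

responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
    [1, 1, 1, 0],
])
definitions = ["Walking", "Purposeful", "Real-world", "Walking bout"]

agreement = responses.mean(axis=0) * 100      # % agreement per definition
consensus = agreement > 75                    # a-priori consensus threshold
for name, pct, ok in zip(definitions, agreement, consensus):
    print(f"{name}: {pct:.0f}% agreement, consensus reached: {ok}")
```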
ECG derived feature combination versus single feature in predicting defibrillation success in out-of-hospital cardiac arrested patients
Objective: Algorithms that predict shock outcome from ventricular fibrillation (VF) waveform features are a potentially useful tool to optimize the defibrillation strategy (immediate defibrillation versus cardiopulmonary resuscitation). Researchers have investigated numerous predictive features and classification methods using single VF features and/or their combinations; however, the reported predictive performance is inconsistent. The purpose of this study was to validate whether combining VF features can enhance prediction accuracy in comparison to a single feature. Approach: The analysis was performed in three stages: feature extraction, preprocessing, and feature selection and classification. Twenty-eight predictive features were calculated on a 4 s episode of the pre-shock VF signal. The preprocessing included instance normalization and oversampling. Seven machine learning algorithms were employed for selecting the best performing single feature and combination of features using a wrapper method: Logistic Regression (LR), Naïve Bayes (NB), Decision Tree (C4.5), AdaBoost.M1 (AB), Support Vector Machine (SVM), Nearest Neighbour (NN) and Random Forest (RF). The algorithms were evaluated by a nested 10-fold cross-validation procedure. Main results: A total of 251 unbalanced first shocks (195 unsuccessful and 56 successful) were oversampled to 195 instances in each class. A performance metric based on the average accuracy of feature combinations showed that LR and NB exhibit no improvement, C4.5 and AB an improvement not greater than 1%, and SVM, NN and RF an improvement greater than 5% in predicting defibrillation outcome compared with the best single feature. Significance: By using a wrapper method to select the best performing feature combination, the non-linear machine learning strategies (SVM, NN, RF) can improve defibrillation prediction performance.
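A minimal sketch of wrapper-based feature selection evaluated with nested cross-validation, using scikit-learn. The synthetic data, the five-feature cap and the random forest are illustrative assumptions (the study compared seven classifiers and included an oversampling step that is not shown here); the structure, i.e. an inner CV loop inside the wrapper and an outer CV loop for unbiased evaluation, is the point being illustrated.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(251, 28))        # 28 VF waveform features per pre-shock episode (synthetic)
y = rng.integers(0, 2, size=251)      # shock outcome (0 = unsuccessful, 1 = successful), synthetic

clf = RandomForestClassifier(n_estimators=200, random_state=0)
pipe = Pipeline([
    ("scale", StandardScaler()),
    # wrapper: forward selection, each candidate subset scored by inner 10-fold CV
    ("select", SequentialFeatureSelector(clf, n_features_to_select=5, cv=10)),
    ("model", clf),
])

# outer 10-fold CV estimates the performance of the selected feature combination
scores = cross_val_score(pipe, X, y, cv=10, scoring="accuracy")
print(f"nested-CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```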
Non-invasive estimation of muscle fibre size from high-density electromyography
Because of the biophysical relation between muscle fibre diameter and the propagation velocity of action potentials along the muscle fibres, motor unit conduction velocity could be a non-invasive index of muscle fibre size in humans. However, the relation between motor unit conduction velocity and fibre size has only been assessed indirectly in animal models and in human patients with invasive intramuscular EMG recordings, or it has been mathematically derived from computer simulations. By combining an advanced non-invasive technique to record motor unit activity in vivo, i.e. high-density surface EMG, with the gold standard technique for muscle tissue sampling, i.e. muscle biopsy, we investigated the relation between the conduction velocity of populations of motor units identified from the biceps brachii muscle and muscle fibre diameter. We demonstrate the possibility of predicting muscle fibre diameter (R2 = 0.66) and cross-sectional area (R2 = 0.65) from conduction velocity estimates with low systematic bias (∼2% and ∼4%, respectively) and a relatively low margin of individual error (∼8% and ∼16%, respectively). The proposed neuromuscular interface opens new perspectives in the use of high-density EMG as a non-invasive tool to estimate muscle fibre size without the need for surgical biopsy sampling. The non-invasive nature of high-density surface EMG for the assessment of muscle fibre size may be useful in studies monitoring child development, ageing, space and exercise physiology, although the applicability and validity of the proposed methodology need to be assessed more directly in these specific populations by future studies.
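The kind of prediction reported above can be sketched as a simple regression of fibre size on conduction velocity. The data, slope, intercept and noise level below are synthetic placeholders, not the study's values; the sketch only shows how R2 and systematic bias of such a predictor would be computed.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
cv = rng.uniform(3.5, 5.5, size=40)                    # motor unit conduction velocity [m/s]
diameter = 10 + 9 * cv + rng.normal(0, 3, size=40)     # fibre diameter [um], assumed noisy linear relation

model = LinearRegression().fit(cv.reshape(-1, 1), diameter)
pred = model.predict(cv.reshape(-1, 1))
bias = np.mean(pred - diameter) / np.mean(diameter) * 100   # systematic bias [%]
print(f"R2 = {r2_score(diameter, pred):.2f}, bias = {bias:.1f}%")
```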
Technology in Parkinson's disease: challenges and opportunities
The miniaturization, sophistication, proliferation, and accessibility of technologies are enabling the capture of more and previously inaccessible phenomena in Parkinson's disease (PD). However, more information has not translated into a greater understanding of disease complexity to satisfy diagnostic and therapeutic needs. Challenges include noncompatible technology platforms, the need for wide-scale and long-term deployment of sensor technology (among vulnerable elderly patients in particular), and the gap between the "big data" acquired with sensitive measurement technologies and their limited clinical application. Major opportunities could be realized if new technologies are developed as part of open-source and/or open-hardware platforms that enable multichannel data capture sensitive to the broad range of motor and nonmotor problems that characterize PD and are adaptable into self-adjusting, individualized treatment delivery systems. The International Parkinson and Movement Disorders Society Task Force on Technology is entrusted to convene engineers, clinicians, researchers, and patients to promote the development of integrated measurement and closed-loop therapeutic systems with high patient adherence that also serve to (1) encourage the adoption of clinico-pathophysiologic phenotyping and early detection of critical disease milestones, (2) enhance the tailoring of symptomatic therapy, (3) improve subgroup targeting of patients for future testing of disease-modifying treatments, and (4) identify objective biomarkers to improve the longitudinal tracking of impairments in clinical care and research. This article summarizes the work carried out by the task force toward identifying challenges and opportunities in the development of technologies with potential for improving the clinical management and the quality of life of individuals with PD. © 2016 International Parkinson and Movement Disorder Society
Correction to: Assessing real-world gait with digital technology? Validation, insights and recommendations from the Mobilise-D consortium (Journal of NeuroEngineering and Rehabilitation, (2023), 20, 1, (78), 10.1186/s12984-023-01198-5)
© The Author(s) 2024. Following publication of the original article [1], the author noticed errors in Table 1 and in the Discussion section. In Table 1, under the Metric (Gait sequence detection) column, the algorithm GSDB had been listed with the wrong description, input, output, language and citation, and GSDc with the wrong description; both have been corrected as shown below.

Table 1 Description of algorithms for each metric: gait sequence detection (GSD), initial contact event detection (ICD), cadence estimation (CAD) and stride length estimation (SL)

GSDA
Description: Based on a frequency-based approach, this algorithm is implemented on the vertical and anterior-posterior acceleration signals. First, these are band-pass filtered to keep frequencies between 0.5 and 3 Hz. Next, a convolution with a 2 Hz sinewave (representing a template for a gait cycle) is performed, from which local maxima are detected to define the regions of gait.
Input: acc_v (vertical acceleration); acc_ap (anterior-posterior acceleration); WinS = 3 s (window size for convolution); OL = 1.5 s (overlap of windows); Activity_thresh = 0.01 (motion threshold); Fs (sampling frequency).
Output: Start, beginning of N gait sequences [s] relative to the start of a recording or a test/trial (1 × N vector); End, termination of N gait sequences [s] relative to the start of a recording or a test/trial (1 × N vector).
Language: Matlab®
References: Iluz, Gazit [40]

GSDB
Description: This algorithm, based on a time-domain approach, detects the gait periods based on identified steps. First, the norm of the triaxial acceleration signal is low-pass filtered (FIR, fc = 3.2 Hz), then a peak detection procedure using a threshold of 0.1 g is applied to identify steps. Consecutive steps, detected using an adaptive step duration threshold, are associated to gait sequences.
Input: acc_norm (norm of the 3D-accelerometer signal); Fs (sampling frequency); th (peak detection threshold, 0.1 g).
Output: Start and End of N gait sequences [s] relative to the start of a recording or a test/trial (each a 1 × N vector).
Language: Matlab®
References: Paraschiv-Ionescu, Newman [41]

GSDc
Description: This algorithm utilizes the same approach as GSDB, the only difference being a peak detection threshold of 0.15 g.
Input: acc_norm (norm of the 3D-accelerometer signal); Fs (sampling frequency); th (peak detection threshold, 0.15 g).
Output: Start and End of N gait sequences [s] relative to the start of a recording or a test/trial (each a 1 × N vector).
Language: Matlab®
References: Paraschiv-Ionescu, Newman [41]

In the Discussion section, the paragraph should read "Based on our findings collectively, we recommend using GSDB on cohorts with slower gait speeds and substantial gait impairments (e.g., proximal femoral fracture). This may be because this algorithm is based on the acceleration norm (overall accelerometry signal rather than a specific axis/direction (e.g., vertical), hence it is more robust to sensor misalignments that are common in unsupervised real-life settings. Moreover, the use of adaptive threshold, that are derived from the features of a subject's data and applied to step duration for detection of steps belonging to gait sequences, allows increased robustness of the algorithm to irregular and unstable gait patterns" instead of "Based on our findings collectively, we recommend using GSDB on cohorts with slower gait speeds and substantial gait impairments (e.g., proximal femoral fracture). This may be because this algorithm is based on the acceleration norm (overall accelerometry signal rather than a specific axis/direction (e.g., vertical), hence it is more robust to sensor misalignments that are common in unsupervised real-life settings [41]. Moreover, the use of adaptive thresholds, that are derived from the features of a subject's data and applied to the amplitude of acceleration norm and to step duration for detection of steps belonging to gait sequences, allows increased robustness of the algorithm to irregular and unstable gait patterns".
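For readers who want to prototype the corrected GSDB behaviour, a minimal Python sketch is given below. It follows the time-domain recipe from the corrected table entry (low-pass FIR at 3.2 Hz on the acceleration norm, peak detection at 0.1 g, grouping of consecutive steps into sequences), but the filter length and the fixed maximum step duration used to merge steps are illustrative assumptions; the published algorithm uses an adaptive step-duration threshold and is implemented in Matlab®.

```python
import numpy as np
from scipy.signal import filtfilt, find_peaks, firwin

def detect_gait_sequences(acc_norm, fs, th=0.1, max_step_s=2.0):
    """Return (start, end) times [s] of gait sequences from the acceleration norm [g]."""
    numtaps = int(fs) | 1                                   # odd filter length (~1 s), an assumption
    b = firwin(numtaps, cutoff=3.2, fs=fs)                  # low-pass FIR, fc = 3.2 Hz
    filtered = filtfilt(b, [1.0], acc_norm - np.mean(acc_norm))
    peaks, _ = find_peaks(filtered, height=th)              # candidate step peaks above 0.1 g
    if peaks.size == 0:
        return np.empty((0, 2))

    sequences, start = [], peaks[0]
    for prev, nxt in zip(peaks[:-1], peaks[1:]):
        if (nxt - prev) / fs > max_step_s:                  # gap too long: close the current sequence
            sequences.append((start / fs, prev / fs))
            start = nxt
    sequences.append((start / fs, peaks[-1] / fs))
    return np.array(sequences)

# Example with a synthetic 1-minute signal sampled at 100 Hz ("steps" only in the first 30 s)
fs = 100.0
t = np.arange(0, 60, 1 / fs)
acc_norm = 1.0 + 0.3 * np.maximum(np.sin(2 * np.pi * 2 * t), 0) * (t < 30)
print(detect_gait_sequences(acc_norm, fs))
```

Working on the acceleration norm rather than a single axis, as the correction emphasizes, is what makes this style of detector tolerant to sensor misalignment in unsupervised recordings.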