    Automatic Infant Respiration Estimation from Video: A Deep Flow-based Algorithm and a Novel Public Benchmark

    Respiration is a critical vital sign for infants, and continuous respiratory monitoring is particularly important for newborns. However, neonates are sensitive, and contact-based sensors present challenges in comfort, hygiene, and skin health, especially for preterm babies. As a step toward fully automatic, continuous, and contactless respiratory monitoring, we develop a deep-learning method for estimating respiratory rate and waveform from plain video footage in natural settings. Our automated infant respiration flow-based network (AIRFlowNet) combines video-extracted optical-flow input with spatiotemporal convolutional processing tuned to the infant domain. We support our model with the first public annotated infant respiration dataset (AIR-125), comprising 125 videos drawn from eight infant subjects under varied pose, lighting, and camera conditions. We include manual respiration annotations and optimize AIRFlowNet training on them using a novel spectral bandpass loss function. When trained and tested on the AIR-125 infant data, our method significantly outperforms other state-of-the-art methods in respiratory rate estimation, achieving a mean absolute error of ~2.9 breaths per minute, compared to ~4.7–6.2 for other public models designed for adult subjects and more uniform environments.
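    The "spectral bandpass loss" named in this abstract could be sketched as a penalty on predicted-waveform power outside a plausible respiration band. This is a minimal NumPy illustration only: the exact formulation and the band limits (here ~30–90 breaths/min) are assumptions, not the paper's actual loss.

```python
import numpy as np

def spectral_bandpass_loss(pred_wave, fs, band=(0.5, 1.5)):
    """Penalise spectral energy outside a plausible respiration band.

    pred_wave : 1-D predicted respiration waveform
    fs        : sampling rate in Hz
    band      : (low, high) respiration band in Hz (assumed, ~30-90 bpm)
    """
    spectrum = np.abs(np.fft.rfft(pred_wave - pred_wave.mean())) ** 2
    freqs = np.fft.rfftfreq(pred_wave.size, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum.sum() + 1e-12
    # loss = fraction of signal power falling outside the respiration band
    return float(spectrum[~in_band].sum() / total)
```

    A clean 0.8 Hz (48 breaths/min) waveform yields a loss near 0, while a signal dominated by out-of-band frequencies yields a loss near 1; in a training loop the same computation would be expressed with differentiable tensor ops.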

    Multispectral Video Fusion for Non-contact Monitoring of Respiratory Rate and Apnea

    Continuous monitoring of respiratory activity is desirable in many clinical applications to detect respiratory events. Non-contact monitoring of respiration can be achieved with near- and far-infrared spectrum cameras. However, current technologies are not sufficiently robust to be used in clinical applications. For example, they fail to estimate an accurate respiratory rate (RR) during apnea. We present a novel algorithm based on multispectral data fusion that aims to estimate RR even during apnea. The algorithm independently addresses the RR estimation and apnea detection tasks. Respiratory information is extracted from multiple sources and fed into an RR estimator and an apnea detector whose results are fused into a final respiratory activity estimation. We evaluated the system retrospectively using data from 30 healthy adults who performed diverse controlled breathing tasks while lying supine in a dark room and reproduced central and obstructive apneic events. Combining respiratory information from multiple spectral channels improved the root mean square error (RMSE) of the RR estimation from up to 4.64 breaths/min (monospectral data) down to 1.60 breaths/min. The median F1 scores for classifying obstructive (0.75 to 0.86) and central apnea (0.75 to 0.93) also improved. Furthermore, the independent consideration of apnea detection led to a more robust system (RMSE of 4.44 vs. 7.96 breaths/min). Our findings may represent a step towards the use of cameras for vital sign monitoring in medical applications.
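    The fusion step described above, combining per-channel RR estimates into one final value, could look like the following sketch. The paper does not specify its fusion rule here, so the quality-weighted average with median-based outlier rejection (and the 5 breaths/min rejection threshold) is an illustrative assumption.

```python
import numpy as np

def fuse_rr_estimates(estimates, qualities):
    """Fuse respiratory-rate estimates from several spectral channels.

    estimates : RR estimates in breaths/min, one per channel
    qualities : non-negative quality scores (e.g. spectral SNR), one per channel
    """
    est = np.asarray(estimates, dtype=float)
    q = np.asarray(qualities, dtype=float)
    med = np.median(est)
    # reject channels deviating more than 5 bpm from the cross-channel median
    keep = np.abs(est - med) <= 5.0
    if not keep.any() or q[keep].sum() == 0:
        return float(med)
    # quality-weighted average over the surviving channels
    return float(np.average(est[keep], weights=q[keep]))
```

    For example, channels reporting 15, 16, and 40 breaths/min with equal quality fuse to 15.5, since the 40 breaths/min outlier is rejected before averaging.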

    Signal Processing Approaches for Cardio-Respiratory Biosignals with an Emphasis on Mobile Health Applications

    We humans are constantly preoccupied with our health and physiological status. From precise measurements such as the 12-lead electrocardiograms recorded in hospitals, we have moved on to mobile acquisition devices, now as versatile as smartwatches and smartphones. Established signal processing techniques do not cater to the particularities of mobile biomedical health monitoring applications. Moreover, although our capabilities to acquire data are growing, many underlying physiological phenomena remain poorly understood. This thesis focuses on two aspects of biomedical signal processing. First, we investigate the physiological basis of the relationship between cardiac and breathing biosignals. Second, we propose a methodology to understand and use this relationship in health monitoring applications. Part I of this dissertation examines the physiological background of the cardio-respiratory relationship and indexes based on this relationship. We propose a methodology to extract the respiratory sinus arrhythmia (RSA), which is an important aspect of this relationship. Furthermore, we propose novel indexes incorporating dynamics of the cardio-respiratory relationship, using the RSA and the phase lag between RSA and breathing. We then systematically evaluate existing and novel indexes under known autonomic stimuli. We demonstrate our indexes to be viable additions to the existing ones, thanks to their performance and physiological merits. Part II focuses on real-time and instantaneous methods for the estimation of breathing parameters from cardiac activity, which is an important application of the cardio-respiratory relationship. The breathing rate is estimated from electrocardiogram and imaging photoplethysmogram recordings, using two dedicated filtering schemes, one of which is novel. Our algorithm measures this important vital rhythm in a truly real-time manner, with significantly shorter delays than existing methods. Furthermore, we identify situations in which an important assumption regarding the estimation of breathing parameters from cardiac activity does not hold, and draw a road-map to overcome this problem. In Part III, we use the indexes and methodology developed in Parts I and II in two applications for mobile health monitoring, namely, emotion recognition and sleep apnea detection from cardiac and breathing biosignals. Results on challenging datasets show that the cardio-respiratory indexes introduced in the present thesis, especially those related to the phase lag between RSA and breathing, are successful for emotion recognition and sleep apnea detection. The novel indexes prove complementary to previous ones, and bring additional insight into the physiological basis of emotions and apnea episodes. To summarize, the techniques proposed in this thesis help to bypass shortcomings of previous approaches in the understanding and the estimation of cardio-respiratory coupling in real-life mobile health monitoring.
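    The core idea of Part II, recovering the breathing rate from the respiratory modulation of inter-beat intervals (RSA), can be illustrated offline with a simple spectral estimate. The thesis itself proposes real-time filtering schemes; this FFT-based NumPy version, including the 4 Hz resampling grid and the 0.1–0.5 Hz search band, is only a simplified sketch of the principle.

```python
import numpy as np

def breathing_rate_from_ibis(beat_times, fs=4.0, band=(0.1, 0.5)):
    """Estimate breathing rate from cardiac inter-beat intervals (RSA).

    beat_times : times of detected heartbeats in seconds
    fs         : resampling rate of the tachogram in Hz (assumed)
    Returns the breathing rate in breaths/min.
    """
    beat_times = np.asarray(beat_times, dtype=float)
    ibis = np.diff(beat_times)                       # tachogram (s)
    t = np.arange(beat_times[1], beat_times[-1], 1.0 / fs)
    tacho = np.interp(t, beat_times[1:], ibis)       # uniform resampling
    tacho -= tacho.mean()
    spectrum = np.abs(np.fft.rfft(tacho))
    freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # dominant tachogram frequency inside the respiratory band
    f_resp = freqs[in_band][np.argmax(spectrum[in_band])]
    return float(60.0 * f_resp)
```

    On a simulated heartbeat series whose period is sinusoidally modulated at 0.25 Hz, this recovers roughly 15 breaths/min.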

    Remote Photoplethysmography in Infrared - Towards Contactless Sleep Monitoring


    Real-Time Respiratory Rate Estimation using Imaging Photoplethysmography Inter-Beat Intervals

    Imaging photoplethysmography (iPPG) has emerged as a contactless heart-rate monitoring technique. As respiratory activity modulates the heart rate, we investigate the accuracy of iPPG in conveying the inter-beat variation due to the respiratory modulation of the heart rate. The instantaneous respiratory rate was estimated in real-time from the iPPG inter-beat variations with an algorithm based on a bank of short FIR notch filters. The comparison of the iPPG-based respiratory rate estimates to ECG-based estimates showed that the iPPG ones were only slightly less accurate in spite of the challenging conditions related to this contactless technique.
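    The filter-bank idea can be illustrated by scanning candidate rates and keeping the one whose filter output is strongest. Note the hedge: the paper uses short FIR notch filters in a real-time scheme, whereas this sketch uses an equivalent-in-spirit bank of quadrature (sin/cos) correlators over a finite window, with an assumed 6–30 breaths/min candidate grid.

```python
import numpy as np

def instantaneous_rr(tacho, fs, candidates_bpm=tuple(range(6, 31))):
    """Pick the respiratory rate whose quadrature correlator output is strongest.

    tacho : uniformly resampled inter-beat-interval series (mean removed inside)
    fs    : sampling rate of tacho in Hz
    Returns the winning candidate in breaths/min.
    """
    x = np.asarray(tacho, dtype=float)
    x = x - x.mean()
    n = np.arange(x.size) / fs
    energies = []
    for bpm in candidates_bpm:
        f = bpm / 60.0
        # project onto sine and cosine at the candidate frequency
        c = np.dot(x, np.cos(2 * np.pi * f * n))
        s = np.dot(x, np.sin(2 * np.pi * f * n))
        energies.append(c * c + s * s)
    return float(candidates_bpm[int(np.argmax(energies))])
```

    Applied to a synthetic tachogram oscillating at 0.25 Hz, the bank selects 15 breaths/min; a real-time variant would evaluate the same projections over a short sliding window.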

    Models and Analysis of Vocal Emissions for Biomedical Applications

    The International Workshop on Models and Analysis of Vocal Emissions for Biomedical Applications (MAVEBA) came into being in 1999 from the keenly felt need to share know-how, objectives, and results between areas that until then had seemed quite distinct, such as bioengineering, medicine, and singing. MAVEBA deals with all aspects of the study of the human voice, with applications ranging from the neonate to the adult and elderly. Over the years the initial issues have grown and spread into other areas of research such as occupational voice disorders, neurology, rehabilitation, and image and video analysis. MAVEBA takes place every two years, always in Firenze, Italy. This edition celebrates twenty years of uninterrupted and successful research in the field of voice analysis.

    Impact of body position on imaging ballistocardiographic signals

    Current work aims at the unobtrusive acquisition of vital parameters from videos. The most common approach exploits subtle color variations. The analysis of cardiovascular induced motion from videos (imaging ballistocardiography, iBCG) is another approach that can supplement the analysis of color changes. The presented study systematically investigates the impact of body position (supine vs. upright) on iBCG. Our research focuses on heart rate estimation by iBCG and on the possibility of analysing ballistocardiographic waveforms from iBCG. We use our own data from 30 healthy volunteers, who went through repeated orthostatic maneuvers on a tilt table. Processing is done according to common procedures for iBCG, including feature tracking, dimensionality reduction, and bandpass filtering. Our results indicate that heart rate estimation works well in the supine position (root mean square error of heart rate estimation 5.68 beats per minute). The performance drastically degrades in the upright (standing) position (root mean square error of heart rate estimation 21.20 beats per minute). With respect to the analysis of beat waveforms, we found large intra-subject and inter-subject variations. Only in a few cases does the resulting waveform closely resemble the ideal ballistocardiographic waveform. Our investigation indicates that body position has a large effect on iBCG and should be considered in algorithmic developments and testing.
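    The common iBCG pipeline named in this abstract (feature tracking, dimensionality reduction, bandpass filtering) could be sketched as below. Here PCA is a plain SVD of the trajectory matrix and the "bandpass" is a spectral search restricted to an assumed 0.75–2.5 Hz (45–150 bpm) heart-rate band; the exact parameters of the study's pipeline are not reproduced.

```python
import numpy as np

def ibcg_heart_rate(trajectories, fps, band=(0.75, 2.5)):
    """Estimate heart rate from tracked feature trajectories (iBCG).

    trajectories : array (n_features, n_frames) of tracked positions
    fps          : video frame rate in Hz
    Returns the heart rate in beats/min.
    """
    X = np.asarray(trajectories, dtype=float)
    X = X - X.mean(axis=1, keepdims=True)
    # first right-singular vector = dominant shared motion over time
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    motion = vt[0]
    freqs = np.fft.rfftfreq(motion.size, d=1.0 / fps)
    spec = np.abs(np.fft.rfft(motion))
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    # strongest in-band spectral peak, converted to beats/min
    f_hr = freqs[in_band][np.argmax(spec[in_band])]
    return float(60.0 * f_hr)
```

    With several trajectories sharing a 1.2 Hz cardiac oscillation plus small noise, the estimate comes out near 72 beats/min; the degradation in the upright position reported above corresponds to this shared component being swamped by postural sway.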

    Continuous Camera-Based Premature-Infant Monitoring Algorithms for NICU

    Non-contact visual monitoring of vital signs in neonatology has been demonstrated by several recent studies in ideal scenarios where the baby is calm and there is no medical or parental intervention. Similar to contact monitoring methods (e.g., ECG, pulse oximeter), the camera-based solutions suffer from motion artifacts. Therefore, during care and the infants' active periods, calculated values typically differ substantially from the real ones. Accordingly, our main contribution to existing remote camera-based techniques is to detect and classify such situations with a high level of confidence. Our algorithms can not only evaluate quiet periods, but can also provide continuous monitoring. Altogether, our proposed algorithms can measure pulse rate and breathing rate, and recognize situations such as medical intervention or very active subjects, using only a single camera, while the system does not exceed the computational capabilities of average CPU-GPU-based hardware. The performance of the algorithms was evaluated on our database collected at the Ist Dept. of Neonatology of Pediatrics, Dept of Obstetrics and Gynecology, Semmelweis University, Budapest, Hungary.
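    The situation-detection idea, flagging frames where motion invalidates the vital-sign estimate, can be reduced to a toy motion-magnitude classifier. Everything here is an illustrative assumption (the mean-absolute-frame-difference statistic, the thresholds, and the three labels), not the paper's actual classifier.

```python
import numpy as np

def classify_activity(frames, calm_thresh=2.0, active_thresh=8.0):
    """Label each consecutive frame pair as 'calm', 'active', or 'intervention'.

    frames : array (n_frames, h, w) of grayscale video frames
    Thresholds are in mean absolute gray-level difference per pixel (assumed).
    """
    frames = np.asarray(frames, dtype=float)
    # per-pair mean absolute pixel difference as a crude motion statistic
    motion = np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))
    labels = np.full(motion.size, 'calm', dtype=object)
    labels[motion > calm_thresh] = 'active'
    labels[motion > active_thresh] = 'intervention'
    return list(labels)
```

    In a full system, pulse- and breathing-rate estimates would simply be suppressed (or down-weighted) for segments not labeled 'calm', which is how continuous monitoring can coexist with unreliable active periods.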