
    An inclusive survey of contactless wireless sensing: a technology used for remotely monitoring vital signs has the potential to combat COVID-19

    With the Coronavirus pandemic showing no signs of abating, companies and governments around the world are spending millions of dollars to develop contactless sensor technologies that minimize the need for physical interaction between patients and healthcare providers. As a result, healthcare research is rapidly progressing towards innovative contactless technologies, especially for infants and elderly people suffering from chronic diseases that require continuous, real-time monitoring and control. The fusion of sensing technology and wireless communication has emerged as a strong research candidate because wearable sensor devices are undesirable for patients, causing anxiety and discomfort. Furthermore, physical contact exacerbates the spread of contagious diseases, which may lead to catastrophic consequences. For this reason, research has moved towards sensor-less, contactless technology: wireless signals are transmitted, and the reflected signals are analyzed and processed using techniques such as frequency modulated continuous wave (FMCW) or channel state information (CSI). It thus becomes easy to monitor and measure a subject's vital signs remotely, without physical contact and without asking them to wear sensor devices. In this paper, we overview and explore state-of-the-art research in the field of contactless sensor technology in medicine, where we explain, summarize, and classify a plethora of contactless sensor technologies and techniques with the highest impact on contactless healthcare. Moreover, we overview the enabling hardware technologies and discuss the main challenges faced by these systems. This work is funded by the Scientific and Technological Research Council of Turkey (TÜBİTAK) under grant 119E39
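    To make the signal-processing step concrete, the following is a minimal, illustrative sketch (not taken from any of the surveyed systems) of how a reflected-signal time series might be turned into a respiration-rate estimate. It assumes the hardware-specific extraction has already been done and that `signal` holds either the unwrapped phase of the FMCW range bin containing the subject or the amplitude of a single Wi-Fi CSI subcarrier, sampled at `fs` Hz; both names are hypothetical.

    # Hypothetical sketch: respiration rate from a contactless RF-derived signal.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def respiration_rate_bpm(signal, fs):
        """Estimate respiration rate (breaths/min) from a 1-D RF-derived signal."""
        # Keep only the typical breathing band (~0.1-0.7 Hz, i.e. 6-42 breaths/min).
        b, a = butter(4, [0.1, 0.7], btype="band", fs=fs)
        filtered = filtfilt(b, a, signal - np.mean(signal))
        # Pick the dominant spectral peak inside the breathing band.
        spectrum = np.abs(np.fft.rfft(filtered))
        freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
        band = (freqs >= 0.1) & (freqs <= 0.7)
        return freqs[band][np.argmax(spectrum[band])] * 60.0

    # Example on a synthetic 15 breaths/min chest-motion signal sampled at 20 Hz.
    fs = 20.0
    t = np.arange(0, 60, 1.0 / fs)
    synthetic = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.randn(t.size)
    print(round(respiration_rate_bpm(synthetic, fs), 1))  # approximately 15.0

    The same spectral-peak idea underlies many FMCW- and CSI-based systems; practical implementations add clutter removal, subject localization, and motion-artifact rejection, which are out of scope here.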

    Satellites at work (Space in the seventies)

    The use of satellites in the areas of communications, meteorology, geodesy, navigation, air traffic control, and earth resources technology is discussed. NASA contributions to various programs are reviewed.

    State of the art of audio- and video-based solutions for AAL

    Working Group 3. Audio- and Video-based AAL Applications. Europe is facing increasingly crucial challenges in health and social care due to demographic change and the current economic context. The recent COVID-19 pandemic has stressed this situation even further, highlighting the need to take action. Active and Assisted Living (AAL) technologies are a viable approach to help face these challenges, thanks to their high potential for enabling remote care and support. Broadly speaking, AAL refers to the use of innovative and advanced Information and Communication Technologies to create supportive, inclusive and empowering applications and environments that enable older, impaired or frail people to live independently and stay active longer in society. AAL capitalizes on the growing pervasiveness and effectiveness of sensing and computing facilities to supply persons in need with smart assistance, responding to their needs for autonomy, independence, comfort, security and safety. The application scenarios addressed by AAL are complex, due to the inherent heterogeneity of the end-user population, their living arrangements, and their physical conditions or impairments. Despite aiming at diverse goals, AAL systems should share some common characteristics: they are designed to provide support in daily life in an invisible, unobtrusive and user-friendly manner, and they are conceived to be intelligent, able to learn and adapt to the requirements and requests of the assisted people, and to synchronise with their specific needs. Nevertheless, to ensure the uptake of AAL in society, potential users must be willing to use AAL applications and to integrate them into their daily environments and lives.

    In this respect, video- and audio-based AAL applications have several advantages in terms of unobtrusiveness and information richness. Cameras and microphones are far less obtrusive than the hindrance other wearable sensors may cause to one's activities. In addition, a single camera placed in a room can record most of the activities performed in it, thus replacing many other non-visual sensors. Currently, video-based applications are effective in recognising and monitoring the activities, movements, and overall conditions of the assisted individuals, as well as in assessing their vital parameters (e.g., heart rate, respiratory rate). Similarly, audio sensors have the potential to become one of the most important modalities for interaction with AAL systems, as they have a large sensing range, do not require physical presence at a particular location, and are physically intangible. Moreover, relevant information about individuals' activities and health status can be derived from processing audio signals (e.g., speech recordings). As the other side of the coin, however, cameras and microphones are often perceived as the most intrusive technologies from the viewpoint of the privacy of the monitored individuals, due to the richness of the information they convey and the intimate settings where they may be deployed. Solutions able to ensure privacy preservation by context and by design, as well as high legal and ethical standards, are in high demand. After the review of the current state of play and the discussion in GoodBrother, we may claim that the first solutions in this direction are starting to appear in the literature. A multidisciplinary debate among experts and stakeholders is paving the way towards AAL that ensures ergonomics, usability, acceptance and privacy preservation. The DIANA, PAAL, and VisuAAL projects are examples of this fresh approach.

    This report provides the reader with a review of the most recent advances in audio- and video-based monitoring technologies for AAL. It has been drafted as a collective effort of WG3 to supply an introduction to AAL, its evolution over time, and its main functional and technological underpinnings. In this respect, the report contributes to the field with the outline of a new generation of ethical-aware AAL technologies and a proposal for a novel, comprehensive taxonomy of AAL systems and applications. Moreover, the report allows non-technical readers to gather an overview of the main components of an AAL system and how these function and interact with the end users. The report illustrates the state of the art of the most successful AAL applications and functions based on audio and video data, namely (i) lifelogging and self-monitoring, (ii) remote monitoring of vital signs, (iii) emotional state recognition, (iv) food intake monitoring, activity and behaviour recognition, (v) activity and personal assistance, (vi) gesture recognition, (vii) fall detection and prevention, (viii) mobility assessment and frailty recognition, and (ix) cognitive and motor rehabilitation. For these application scenarios, the report illustrates the state of play in terms of scientific advances, available products and research projects, and highlights the open challenges. The report ends with an overview of the challenges, hindrances and opportunities posed by the uptake of AAL technologies in real-world settings. In this respect, it illustrates the current procedural and technological approaches to coping with acceptability, usability and trust in AAL technology, by surveying strategies and approaches to co-design, privacy preservation in video and audio data, transparency and explainability in data processing, and data transmission and communication. User acceptance and ethical considerations are also debated. Finally, the potential arising from the silver economy is overviewed.
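    As a concrete illustration of one of the functions listed above (remote monitoring of vital signs from video), the sketch below shows a basic remote-photoplethysmography pipeline for camera-based heart-rate estimation. It is not taken from the report; it assumes `roi_frames` is a sequence of RGB face-region crops (H x W x 3 arrays) captured at `fps` frames per second, and both names are hypothetical.

    # Illustrative remote-photoplethysmography (rPPG) sketch: heart rate from a
    # face region of interest extracted from ordinary video frames.
    import numpy as np
    from scipy.signal import butter, filtfilt, welch

    def heart_rate_bpm(roi_frames, fps):
        # Mean green-channel intensity per frame: in consumer cameras the green
        # channel carries most of the blood-volume-pulse information.
        trace = np.array([frame[:, :, 1].mean() for frame in roi_frames], dtype=float)
        # Remove slow illumination drift, keep the cardiac band (0.7-3.0 Hz ~ 42-180 bpm).
        b, a = butter(3, [0.7, 3.0], btype="band", fs=fps)
        pulse = filtfilt(b, a, trace - trace.mean())
        # Dominant frequency of the Welch power spectral density gives the heart rate.
        freqs, psd = welch(pulse, fs=fps, nperseg=min(len(pulse), 256))
        band = (freqs >= 0.7) & (freqs <= 3.0)
        return freqs[band][np.argmax(psd[band])] * 60.0

    Privacy-preserving deployments would typically keep the raw frames on the device and expose only derived signals such as this pulse trace.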

    Unobtrusive Health Monitoring in Private Spaces: The Smart Vehicle

    Unobtrusive in-vehicle health monitoring has the potential to use driving time for regular medical check-ups. This work intends to provide a guide to currently proposed sensor systems for in-vehicle monitoring and, in particular, to answer the questions: (1) Which sensors are suitable for in-vehicle data collection? (2) Where should the sensors be placed? (3) Which biosignals or vital signs can be monitored in the vehicle? (4) Which purposes can be supported with the health data? We systematically reviewed the literature and summarized up-to-date research on leveraging sensor technology for unobtrusive in-vehicle health monitoring. Searches of PubMed, IEEE Xplore, and Scopus delivered 959 articles. We first screened titles and abstracts for relevance, then assessed the full texts; 46 papers were finally included and analyzed. A guide to the currently proposed sensor systems is provided: starting from the biomedical data needed for a given purpose, it indicates the potential sensors and links the suggested locations for mounting them. Fifteen types of sensors were found. Driver-centered locations, such as the steering wheel, car seat, and windscreen, are frequently used for mounting unobtrusive sensors, through which typical biosignals like heart rate and respiration rate are measured. To date, most research focuses on sensor technology development, and most application-driven research aims at driving safety. Health-oriented research on the medical use of sensor-derived physiological parameters is still of interest.

    Neonatal non-contact respiratory monitoring based on real-time infrared thermography

    Background: Monitoring of vital parameters is an important topic in neonatal daily care. Progress in computational intelligence and medical sensors has facilitated the development of smart bedside monitors that can integrate multiple parameters into a single monitoring system. This paper describes non-contact monitoring of neonatal vital signals based on infrared thermography as a new biomedical engineering application. One signal of clinical interest is the spontaneous respiration rate of the neonate. It will be shown that the respiration rate of neonates can be monitored by analysing the temperature profile of the anterior naris (nostrils) associated with the successive inspiration and expiration phases.
    Objective: The aim of this study is to develop and investigate a new non-contact respiration monitoring modality for the neonatal intensive care unit (NICU) using infrared thermography imaging. This development includes the subsequent image processing (region of interest (ROI) detection) and its optimization, as well as further optimization of this non-contact respiration monitoring so that it can be considered a physiological measurement inside NICU wards.
    Results: Continuous wavelet transformation based on the Daubechies wavelet function was applied to detect the breathing signal within an image stream. Respiration was successfully monitored based on a 0.3°C to 0.5°C temperature difference between the inspiration and expiration phases.
    Conclusions: Although this method has been applied to adults before, this is the first time it has been used in a newborn infant population inside the neonatal intensive care unit (NICU). The promising results suggest including this technology in advanced NICU monitors.
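    As a rough illustration of the signal path (a simplified stand-in, not the authors' continuous-wavelet pipeline), the sketch below converts a nostril-ROI temperature trace from a thermal camera into a respiration-rate estimate using a band-pass filter and peak counting. `thermal_frames`, `roi`, and `fps` are hypothetical inputs: a sequence of 2-D temperature arrays in °C, a (row_slice, col_slice) pair covering the anterior naris, and the frame rate.

    # Simplified sketch: respiration rate from the nostril-region temperature profile.
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def respiration_rate_bpm(thermal_frames, roi, fps):
        rows, cols = roi
        # Mean nostril temperature per frame; the paper reports a 0.3-0.5 degC swing
        # between the expiration (warmer) and inspiration (cooler) phases.
        trace = np.array([frame[rows, cols].mean() for frame in thermal_frames])
        # Neonatal breathing can exceed 60 breaths/min, so use a wide band
        # (0.2-2.0 Hz, i.e. 12-120 breaths/min).
        b, a = butter(3, [0.2, 2.0], btype="band", fs=fps)
        breathing = filtfilt(b, a, trace - trace.mean())
        # Roughly one detected peak per breath; scale the count to breaths per minute.
        peaks, _ = find_peaks(breathing, distance=max(1, int(fps * 0.4)))
        return len(peaks) / (len(trace) / fps / 60.0)

    The wavelet-based detection used in the paper targets the non-stationary, noisy traces encountered in the NICU; the version above only conveys the overall idea.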

    Video Respiration Monitoring: Towards Remote Apnea Detection in the Clinic


    Estimating heart rate and rhythm via 3D motion tracking in depth video

    Low-cost depth sensors, such as Microsoft Kinect, have potential for non-intrusive, non-contact health monitoring that is robust to ambient lighting conditions. However, captured depth images typically suffer from low bit-depth and high acquisition noise, and hence processing them to estimate biometrics is difficult. In this paper, we propose to capture depth video of a human subject using Kinect 2.0 to estimate his/her heart rate and rhythm (regularity); as blood is pumped from the heart to circulate through the head, tiny oscillatory head motion due to Newtonian mechanics can be detected for periodicity analysis. Specifically, we first restore a captured depth video via a joint bit-depth enhancement / denoising procedure, using a graph-signal smoothness prior for regularization. Second, we track an automatically detected head region throughout the depth video to deduce 3D motion vectors. The detected vectors are fed back to the depth restoration module in a loop to ensure that the motion information in the two modules is consistent, improving the performance of both restoration and motion tracking in the process. Third, the computed 3D motion vectors are projected onto their principal component for 1D signal analysis, composed of trend removal, band-pass filtering, and wavelet-based motion denoising. Finally, the heart rate is estimated via Welch power spectrum analysis, and the heart rhythm is computed via peak detection. Experimental results show accurate estimation of the heart rate and rhythm using our proposed algorithm, as compared to the rate and rhythm estimated by a portable oximeter.
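    The last two stages of this pipeline can be sketched as follows. This is a simplified illustration under assumptions, not the authors' implementation: it takes `motion`, a hypothetical (N, 3) array of per-frame 3D head-motion vectors already tracked from the restored depth video at `fps` frames per second, and omits the graph-based depth restoration and the wavelet-based motion denoising.

    # Simplified sketch: heart rate via Welch spectrum and rhythm via peak detection,
    # starting from tracked 3-D head-motion vectors.
    import numpy as np
    from scipy.signal import butter, filtfilt, welch, detrend, find_peaks

    def heart_rate_and_rhythm(motion, fps):
        # Project onto the principal component: the dominant direction of the
        # tiny ballistocardiographic head oscillation.
        centered = motion - motion.mean(axis=0)
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        signal_1d = detrend(centered @ vt[0])          # trend removal
        # Band-pass to the cardiac band (0.8-2.5 Hz, i.e. 48-150 beats/min).
        b, a = butter(3, [0.8, 2.5], btype="band", fs=fps)
        pulse = filtfilt(b, a, signal_1d)
        # Heart rate from the dominant peak of the Welch power spectrum.
        freqs, psd = welch(pulse, fs=fps, nperseg=min(len(pulse), 512))
        band = (freqs >= 0.8) & (freqs <= 2.5)
        rate_bpm = freqs[band][np.argmax(psd[band])] * 60.0
        # Rhythm (regularity) summarized as the spread of inter-beat intervals.
        peaks, _ = find_peaks(pulse, distance=max(1, int(fps / 2.5)))
        ibi = np.diff(peaks) / fps
        return rate_bpm, float(np.std(ibi)) if ibi.size else float("nan")

    The standard deviation of inter-beat intervals used here is just one convenient regularity measure derived from the detected peaks.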
