Joint Distribution and Transitions of Pain and Activity in Critically Ill Patients
Pain and physical function are both essential indices of recovery in
critically ill patients in the Intensive Care Unit (ICU). Simultaneous
monitoring of pain intensity and patient activity can be important for
determining which analgesic interventions can optimize mobility and function,
while minimizing opioid harm. To date, however, our knowledge of the
relationship between pain and activity has been limited to manual and sporadic
activity assessments. In recent years, wearable devices equipped with 3-axis
accelerometers have been used in many domains to provide a continuous and
automated measure of mobility and physical activity. In this study, we
collected activity intensity data from 57 ICU patients, using the Actigraph
GT3X device. We also collected relevant clinical information, including nurse
assessments of pain intensity, recorded every 1-4 hours. Our results show the
joint distribution and state transitions of activity and pain states in
critically ill patients. Comment: Accepted for Publication in EMBC 202
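As a rough illustration of the kind of analysis described above, the snippet below estimates a joint distribution and a first-order transition matrix from discretized pain and activity sequences; the three pain levels, two activity levels, and the sample data are hypothetical choices for illustration, not the study's actual coding:

```python
import numpy as np

# Hypothetical discretized states recorded at each assessment:
# pain in {0: none, 1: mild, 2: moderate/severe},
# activity in {0: low, 1: high} -- labels are illustrative only.
pain     = np.array([0, 1, 1, 2, 1, 0, 0, 1])
activity = np.array([1, 1, 0, 0, 0, 1, 1, 0])

# Joint distribution over (pain, activity) pairs.
joint = np.zeros((3, 2))
for p, a in zip(pain, activity):
    joint[p, a] += 1
joint /= joint.sum()

# First-order transition matrix over combined states s = pain * 2 + activity.
states = pain * 2 + activity
n = 6
trans = np.zeros((n, n))
for s0, s1 in zip(states[:-1], states[1:]):
    trans[s0, s1] += 1

# Normalize each visited row into transition probabilities;
# rows for unvisited states stay all-zero.
row_sums = trans.sum(axis=1, keepdims=True)
trans = np.divide(trans, row_sums, out=np.zeros_like(trans),
                  where=row_sums > 0)
```

With real data, `pain` and `activity` would come from the nurse assessments and the accelerometer-derived activity intensity, binned onto a shared timeline.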
Human Activity Recognition using Inertial, Physiological and Environmental Sensors: a Comprehensive Survey
In the last decade, Human Activity Recognition (HAR) has become a vibrant
research area, especially due to the spread of electronic devices such as
smartphones, smartwatches and video cameras present in our daily lives. In
addition, the advance of deep learning and other machine learning algorithms
has allowed researchers to use HAR in various domains including sports, health
and well-being applications. For example, HAR is considered one of the most
promising assistive technologies for supporting older adults in daily life by
monitoring their cognitive and physical function through daily activities. This
survey focuses on the critical role of machine learning in developing HAR
applications based on inertial sensors in conjunction with physiological and
environmental sensors. Comment: Accepted for Publication in IEEE Access DOI:
10.1109/ACCESS.2020.303771
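As a minimal sketch of the kind of inertial-sensor HAR pipeline such surveys cover, the snippet below classifies synthetic 3-axis accelerometer windows using hand-crafted features and a nearest-centroid rule; the window length, feature set, and the "rest"/"walk" data are illustrative assumptions, not taken from the survey:

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(window):
    """Mean and standard deviation of the 3-axis signal magnitude."""
    mag = np.linalg.norm(window, axis=1)
    return np.array([mag.mean(), mag.std()])

# Synthetic 3-axis windows (50 samples each) around 1 g of gravity:
# 'rest' has low variance, 'walk' has high variance.
rest = [rng.normal(0, 0.05, (50, 3)) + [0, 0, 1] for _ in range(20)]
walk = [rng.normal(0, 0.6,  (50, 3)) + [0, 0, 1] for _ in range(20)]
X = np.array([extract_features(w) for w in rest + walk])
y = np.array([0] * 20 + [1] * 20)

# Nearest-centroid classifier: assign each window to the closest
# class centroid in feature space.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(X[:, None] - centroids[None], axis=2),
                 axis=1)
accuracy = (pred == y).mean()
```

Real HAR systems replace the synthetic windows with segmented sensor streams and typically use richer features or learned (e.g. deep) representations, but the window-feature-classifier shape is the same.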
AI-Enhanced Intensive Care Unit: Revolutionizing Patient Care with Pervasive Sensing
The intensive care unit (ICU) is a specialized hospital space where
critically ill patients receive intensive care and monitoring. Comprehensive
monitoring is imperative in assessing patients' conditions, in particular
acuity, and ultimately the quality of care. However, the extent of patient
monitoring in the ICU is limited due to time constraints and the workload on
healthcare providers. Currently, visual assessments for acuity, including fine
details such as facial expressions, posture, and mobility, are sporadically
captured, or not captured at all. These manual observations are subjective,
prone to documentation errors, and add to care providers' workload. Artificial
Intelligence (AI)-enabled systems have the potential to augment visual patient
monitoring and assessment owing to their exceptional learning capabilities.
Such systems require robust annotated data for training. To this end, we have
developed a pervasive sensing and data-processing system that collects data
from multiple modalities (depth images, color RGB images, accelerometry,
electromyography, sound pressure, and light levels) in the ICU to support
intelligent monitoring systems for continuous and granular assessment of
acuity, delirium risk, pain, and mobility. This paper presents the Intelligent
Intensive Care Unit (I2CU) system architecture we developed for real-time
patient monitoring and visual assessment.
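One building block any such multimodal architecture needs is aligning asynchronously sampled modalities onto a common timeline. The sketch below keeps a timestamped buffer per modality and snapshots the latest sample of each at a query time; the class and function names are hypothetical, not the I2CU system's actual API:

```python
from dataclasses import dataclass, field
import bisect

@dataclass
class ModalityStream:
    """Timestamped samples from one sensor modality."""
    name: str
    timestamps: list = field(default_factory=list)
    values: list = field(default_factory=list)

    def add(self, t, v):
        # Assumes samples arrive in timestamp order.
        self.timestamps.append(t)
        self.values.append(v)

    def latest_before(self, t):
        """Most recent sample at or before time t, or None."""
        i = bisect.bisect_right(self.timestamps, t) - 1
        return self.values[i] if i >= 0 else None

def snapshot(streams, t):
    """Align modalities by pulling each stream's latest sample at time t."""
    return {s.name: s.latest_before(t) for s in streams}

light = ModalityStream("light"); light.add(0.0, 120); light.add(5.0, 80)
sound = ModalityStream("sound"); sound.add(1.0, 45.2)
aligned = snapshot([light, sound], 4.0)  # -> {'light': 120, 'sound': 45.2}
```

Sample-and-hold alignment like this is the simplest choice; interpolation or windowed aggregation would be natural alternatives for denser streams such as accelerometry.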
Interpretable Multi-Task Deep Neural Networks for Dynamic Predictions of Postoperative Complications
Accurate prediction of postoperative complications can inform shared
decisions between patients and surgeons regarding the appropriateness of
surgery, preoperative risk-reduction strategies, and postoperative resource
use. Traditional predictive analytic tools are hindered by suboptimal
performance and usability. We hypothesized that novel deep learning techniques
would outperform logistic regression models in predicting postoperative
complications. In a single-center longitudinal cohort of 43,943 adult patients
undergoing 52,529 major inpatient surgeries, deep learning yielded greater
discrimination than logistic regression for all nine complications. Predictive
performance was strongest when leveraging the full spectrum of preoperative and
intraoperative physiologic time-series electronic health record data. A single
multi-task deep learning model yielded greater performance than separate models
trained on individual complications. Integrated gradients interpretability
mechanisms demonstrated the substantial importance of missing data.
Interpretable, multi-task deep neural networks made accurate, patient-level
predictions that harbor the potential to augment surgical decision-making.
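The shared-trunk, multi-head structure of a multi-task model can be sketched as below; this is a minimal untrained numpy forward pass with illustrative layer sizes, not the paper's architecture (which is trained on physiologic time series and includes interpretability mechanisms):

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical sizes: 32 preoperative features, a 16-unit shared layer,
# and 9 complication-specific output heads (all sizes are assumptions).
n_in, n_hidden, n_tasks = 32, 16, 9
W_shared = rng.normal(0, 0.1, (n_in, n_hidden))
W_heads  = rng.normal(0, 0.1, (n_hidden, n_tasks))

def predict(x):
    """Shared trunk feeds task-specific heads: one risk per complication."""
    h = np.tanh(x @ W_shared)     # shared patient representation
    return sigmoid(h @ W_heads)   # per-complication probabilities

risks = predict(rng.normal(size=n_in))  # shape (9,)
```

The design point the abstract makes is visible in the shapes: the nine heads share one learned representation, so a single model serves all complications instead of nine separately trained ones.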