AI-Enhanced Intensive Care Unit: Revolutionizing Patient Care with Pervasive Sensing
The intensive care unit (ICU) is a specialized hospital space where
critically ill patients receive intensive care and monitoring. Comprehensive
monitoring is imperative for assessing patients' conditions, in particular
acuity, and ultimately the quality of care. However, the extent of patient
monitoring in the ICU is limited by time constraints and the workload on
healthcare providers. Currently, visual assessments of acuity, including fine
details such as facial expressions, posture, and mobility, are captured
sporadically or not at all. These manual observations are subjective,
prone to documentation errors, and add to care providers' workload.
Artificial Intelligence (AI)-enabled systems have the potential to augment
patient visual monitoring and assessment owing to their exceptional learning
capabilities. Such systems require robustly annotated data for training. To
this end, we have developed a pervasive sensing and data-processing system
that collects data from multiple modalities (depth images, color RGB images,
accelerometry, electromyography, sound pressure, and light levels) in the ICU
to support intelligent monitoring systems for continuous, granular assessment
of acuity, delirium risk, pain, and mobility. This paper presents the
Intelligent Intensive Care Unit (I2CU) system architecture we developed for
real-time patient monitoring and visual assessment.
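As an illustration of the multimodal data collection the abstract describes, the sketch below models a single time-synchronized sample combining the listed modalities. This is a hypothetical structure, not the authors' I2CU implementation; all field names and units are assumptions.

```python
from dataclasses import dataclass
import time

# Illustrative sketch only (not the I2CU codebase): one synchronized
# multimodal sample as such a pipeline might buffer it before annotation.
# Field names and units are hypothetical.
@dataclass
class MultimodalSample:
    timestamp: float          # capture time, seconds since epoch
    depth_frame: list         # depth image, e.g. HxW values
    rgb_frame: list           # color RGB image
    accelerometry: tuple      # (ax, ay, az)
    emg_mv: float             # electromyography amplitude, millivolts
    sound_db: float           # sound pressure level, dB
    light_lux: float          # ambient light level, lux

def make_sample(depth, rgb, accel, emg, spl, lux):
    """Stamp and bundle one reading from each modality."""
    return MultimodalSample(time.time(), depth, rgb, accel, emg, spl, lux)

sample = make_sample([[0]], [[0]], (0.0, 0.0, 1.0), 0.05, 42.0, 310.0)
```

A real system would also need per-sensor clock alignment and consent-aware storage, which this sketch omits.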
Towards responsive Sensitive Artificial Listeners
This paper describes work in the recently started project SEMAINE, which aims to build a set of Sensitive Artificial Listeners: conversational agents designed to sustain an interaction with a human user despite limited verbal skills, through robust recognition and generation of non-verbal behaviour in real time, both when the agent is speaking and listening. We report on data collection and on the design of a system architecture in view of real-time responsiveness.
Sharing Human-Generated Observations by Integrating HMI and the Semantic Sensor Web
Current "Internet of Things" concepts point to a future where connected objects gather meaningful information about their environment and share it with other objects and people. In particular, objects embedding Human Machine Interaction (HMI), such as mobile devices and, increasingly, connected vehicles, home appliances, urban interactive infrastructures, etc., may not only be conceived as sources of sensor information, but, through interaction with their users, they can also produce highly valuable context-aware human-generated observations. We believe that the great promise offered by combining and sharing all of the different sources of information available can be realized through the integration of HMI and Semantic Sensor Web technologies. This paper presents a technological framework that harmonizes two of the most influential HMI and Sensor Web initiatives: the W3C's Multimodal Architecture and Interfaces (MMI) and the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) with its semantic extension, respectively. Although the proposed framework is general enough to be applied in a variety of connected objects integrating HMI, a particular development is presented for a connected car scenario where drivers' observations about the traffic or their environment are shared across the Semantic Sensor Web. For implementation and evaluation purposes an on-board OSGi (Open Services Gateway Initiative) architecture was built, integrating several available HMI, Sensor Web and Semantic Web technologies. A technical performance test and a conceptual validation of the scenario with potential users are reported, with results suggesting the approach is sound.
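To make the idea of a shared human-generated observation concrete, the sketch below wraps a driver's HMI report in a minimal observation-style record. This is a hedged illustration: the field names are loosely inspired by observation-and-measurement conventions but do not follow the actual OGC SWE schemas, and all identifiers are invented.

```python
import json

# Hypothetical sketch: a driver's HMI input packaged as an observation
# record for sharing. Field names are illustrative, NOT the OGC SWE or
# W3C MMI schema.
def hmi_observation(source_id, lat, lon, phenomenon, value):
    return {
        "procedure": f"urn:example:hmi:{source_id}",    # who/what observed
        "observedProperty": phenomenon,                 # what was observed
        "featureOfInterest": {"lat": lat, "lon": lon},  # where
        "result": value,                                # the observation itself
    }

obs = hmi_observation("car-42", 40.4168, -3.7038, "traffic_congestion", "heavy")
payload = json.dumps(obs)  # serialized for publication to other consumers
```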
Towards Vision-Based Smart Hospitals: A System for Tracking and Monitoring Hand Hygiene Compliance
One in twenty-five patients admitted to a hospital will suffer from a
hospital-acquired infection. If we can intelligently track healthcare staff,
patients, and visitors, we can better understand the sources of such
infections. We envision a smart hospital capable of increasing operational
efficiency and improving patient care with less spending. In this paper, we
propose a non-intrusive vision-based system for tracking people's activity in
hospitals. We evaluate our method on the problem of measuring hand hygiene
compliance. Empirically, our method outperforms existing solutions such as
proximity-based techniques and covert in-person observational studies. We
present intuitive, qualitative results that analyze human movement patterns and
conduct spatial analytics that convey our method's interpretability. This work
is a step towards a computer-vision-based smart hospital and demonstrates
promising results for reducing hospital-acquired infections.
Comment: Machine Learning for Healthcare Conference (MLHC)
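The compliance metric such a vision system reports can be sketched simply: compliance is the fraction of detected entry/exit "opportunities" in which a dispenser-use event was also observed. The event format below is hypothetical detector output, not the paper's actual pipeline.

```python
# Minimal sketch (assumed event schema, not the paper's implementation):
# hand-hygiene compliance = complied opportunities / total opportunities.
def compliance_rate(events):
    opportunities = [e for e in events if e["type"] == "opportunity"]
    if not opportunities:
        return 0.0
    complied = [e for e in opportunities if e["used_dispenser"]]
    return len(complied) / len(opportunities)

events = [
    {"type": "opportunity", "used_dispenser": True},
    {"type": "opportunity", "used_dispenser": False},
    {"type": "opportunity", "used_dispenser": True},
]
rate = compliance_rate(events)  # 2 of 3 opportunities complied
```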
A pervasive approach to a real-time intelligent decision support system in intensive medicine
Deciding on the most appropriate procedure to provide patients with the
best possible healthcare is a critical and complex task in Intensive
Care Units (ICUs). Clinical Decision Support Systems (CDSS) must deal with
huge amounts of data and online monitoring, analyzing numerous parameters
and providing outputs in real time. Despite the advances attained in
this area, new challenges should be taken into account in future
CDSS developments, principally in ICU environments. The next generation of
CDSS will be pervasive and ubiquitous, providing doctors with the
appropriate services and information to support decisions regardless of
time or place. Consequently, new requirements arise, namely data privacy
and security of data access. This paper presents a
pervasive perspective on the decision-making process in the context of the
INTCare system, an intelligent decision support system for intensive medicine.
Three scenarios are explored using data mining models that are continuously
assessed and optimized. Some preliminary results are presented and discussed.
Fundação para a Ciência e a Tecnologia (FCT)
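One way to read "data mining models continuously assessed and optimized" is as a rolling evaluation loop: track recent prediction outcomes and flag the model for re-optimization when accuracy falls below a threshold. The sketch below is an illustrative pattern with assumed parameters, not INTCare's actual mechanism.

```python
from collections import deque

# Hedged sketch of continuous model assessment: a rolling window of
# outcomes triggers retraining when accuracy drops. Window size and
# threshold are illustrative choices, not INTCare's settings.
class RollingAssessor:
    def __init__(self, window=100, threshold=0.8):
        self.outcomes = deque(maxlen=window)  # True/False per prediction
        self.threshold = threshold

    def record(self, correct: bool):
        self.outcomes.append(correct)

    def accuracy(self):
        if not self.outcomes:
            return 1.0  # no evidence yet; assume acceptable
        return sum(self.outcomes) / len(self.outcomes)

    def needs_retraining(self):
        return self.accuracy() < self.threshold

a = RollingAssessor(window=10, threshold=0.8)
for ok in [True] * 7 + [False] * 3:   # 70% accuracy in the window
    a.record(ok)
retrain = a.needs_retraining()        # 0.7 < 0.8, so retraining is flagged
```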