How will the Internet of Things enable Augmented Personalized Health?
Internet-of-Things (IoT) is profoundly redefining the way we create, consume,
and share information. Health aficionados and citizens are increasingly using
IoT technologies to track their sleep, food intake, activity, vital body
signals, and other physiological observations. This is complemented by IoT
systems that continuously collect health-related data from the environment and
inside the living quarters. Together, these have created an opportunity for a
new generation of healthcare solutions. However, interpreting data to
understand an individual's health is challenging. It is usually necessary to
look at that individual's clinical record and behavioral information, as well
as social and environmental information affecting that individual. Interpreting
how well a patient is doing also requires looking at their adherence to
the respective health objectives, the application of relevant clinical
knowledge, and the desired outcomes.
We resort to the vision of Augmented Personalized Healthcare (APH) to exploit
the extensive variety of relevant data and medical knowledge using Artificial
Intelligence (AI) techniques to extend and enhance human health, and to present
various stages of augmented health management strategies: self-monitoring,
self-appraisal, self-management, intervention, and disease progress tracking
and prediction. kHealth technology, a specific incarnation of APH, and its
application to Asthma and other diseases are used to provide illustrations and
discuss alternatives for technology-assisted health management. Several
prominent efforts involving IoT and patient-generated health data (PGHD) with
respect to converting multimodal data into actionable information (big data to
smart data) are also identified. The roles of three components in an
evidence-based semantic perception approach (Contextualization, Abstraction,
and Personalization) are discussed
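The Contextualization, Abstraction, and Personalization steps can be illustrated with a deliberately simplified sketch; all function names, thresholds, and labels below are invented for illustration and are not taken from the kHealth system:

```python
# Hypothetical illustration: turning a raw sensor reading ("big data") into
# an actionable, personalized label ("smart data").

def abstract_reading(heart_rate_bpm: float) -> str:
    """Abstraction: map a raw value to a qualitative state (toy thresholds)."""
    if heart_rate_bpm < 60:
        return "low"
    if heart_rate_bpm <= 100:
        return "normal"
    return "elevated"

def contextualize(state: str, activity: str) -> str:
    """Contextualization: the same reading means different things in
    different situations (e.g., exercise vs. rest)."""
    if state == "elevated" and activity == "exercising":
        return "expected"
    return state

def personalize(state: str, personal_baseline: str) -> str:
    """Personalization: compare against the individual's own baseline."""
    if state == personal_baseline:
        return "typical for this individual"
    return f"deviation from baseline ({personal_baseline} -> {state})"

reading = 112.0
state = abstract_reading(reading)                      # qualitative state
in_context = contextualize(state, activity="resting")  # unchanged at rest
print(personalize(in_context, personal_baseline="normal"))
```

The point of the sketch is only the layering: raw value, situational context, then comparison against the individual's own history.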
SAFS: A Deep Feature Selection Approach for Precision Medicine
In this paper, we propose a new deep feature selection method based on deep
architecture. Our method uses stacked auto-encoders for feature representation
in higher-level abstraction. We developed and applied a novel feature learning
approach to a specific precision medicine problem, which focuses on assessing
and prioritizing risk factors for hypertension (HTN) in a vulnerable
demographic subgroup (African-American). Our approach is to use deep learning
to identify significant risk factors affecting left ventricular mass indexed to
body surface area (LVMI) as an indicator of heart damage risk. The results show
that our feature learning and representation approach leads to better results
in comparison with other approaches
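The idea of ranking input features by what an autoencoder relies on for reconstruction can be sketched with a single linear autoencoder on synthetic data; this is a stand-in illustration only, since the paper's stacked architecture and exact scoring rule are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 6 features, where features 0 and 1 carry most of the signal
n, d, hidden = 200, 6, 2
X = rng.normal(size=(n, d)) * np.array([3.0, 2.5, 0.3, 0.3, 0.3, 0.3])

# One linear autoencoder layer (the paper stacks several nonlinear layers)
W = 0.1 * rng.normal(size=(d, hidden))   # encoder weights
V = 0.1 * rng.normal(size=(hidden, d))   # decoder weights

lr = 0.1
for _ in range(2000):
    H = X @ W                 # encode
    X_hat = H @ V             # reconstruct
    E = X_hat - X             # reconstruction error
    dV = H.T @ E * (2 / (n * d))
    dW = X.T @ (E @ V.T) * (2 / (n * d))
    W -= lr * dW
    V -= lr * dV

# Score each input feature by the magnitude of its encoder weights:
# features the autoencoder relies on for reconstruction score highest
scores = np.abs(W).sum(axis=1)
ranking = np.argsort(scores)[::-1]
print("features ranked by importance:", ranking)
```

On this synthetic data the two high-variance features end up ranked first, which is the behavior a reconstruction-based feature score is meant to capture.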
Machine learning for precision psychiatry
The nature of mental illness remains a conundrum. Traditional disease
categories are increasingly suspected to misrepresent the causes underlying
mental disturbance. Yet, psychiatrists and investigators now have an
unprecedented opportunity to benefit from complex patterns in brain, behavior,
and genes using methods from machine learning (e.g., support vector machines,
modern neural-network algorithms, cross-validation procedures). Combining these
analysis techniques with a wealth of data from consortia and repositories has
the potential to advance a biologically grounded re-definition of major
psychiatric disorders. Within the next 10-20 years, incoming patients could be
stratified into distinct biological subgroups that cut across classical
diagnostic boundaries. In a new era of evidence-based psychiatry tailored to
single patients, objectively measurable endophenotypes could allow for
individualized prediction of early diagnosis, treatment selection, and dosage
adjustment to reduce the burden of disease. This primer aims to introduce
clinicians and researchers to the opportunities and challenges in bringing
machine intelligence into psychiatric practice
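The cross-validation procedures mentioned above can be illustrated with a minimal k-fold loop on synthetic two-group data; a simple nearest-centroid rule stands in for a real classifier such as an SVM, and the data are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "endophenotype" features for two synthetic patient subgroups
n_per_group = 60
group_a = rng.normal(loc=0.0, scale=1.0, size=(n_per_group, 4))
group_b = rng.normal(loc=1.2, scale=1.0, size=(n_per_group, 4))
X = np.vstack([group_a, group_b])
y = np.array([0] * n_per_group + [1] * n_per_group)

def nearest_centroid_accuracy(X_tr, y_tr, X_te, y_te):
    """Classify each test sample by the closer class centroid."""
    centroids = np.stack([X_tr[y_tr == c].mean(axis=0) for c in (0, 1)])
    dists = np.linalg.norm(X_te[:, None, :] - centroids[None, :, :], axis=2)
    preds = dists.argmin(axis=1)
    return (preds == y_te).mean()

def k_fold_scores(X, y, k=5, seed=0):
    """Plain k-fold cross-validation: shuffle, split, score each held-out fold."""
    idx = np.random.default_rng(seed).permutation(len(y))
    folds = np.array_split(idx, k)
    scores = []
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        scores.append(nearest_centroid_accuracy(
            X[train_idx], y[train_idx], X[test_idx], y[test_idx]))
    return np.array(scores)

scores = k_fold_scores(X, y)
print("fold accuracies:", np.round(scores, 2))
print("mean accuracy:", round(scores.mean(), 2))
```

Evaluating only on held-out folds is what guards against the overfitting that plagues high-dimensional brain and genetic data.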
Statistical learning and big data applications
The amount of data generated in the field of laboratory medicine has grown to an extent that conventional laboratory information systems (LISs) are struggling to manage and analyze this complex, entangled information (“Big Data”). Statistical learning, a generalized framework from machine learning (ML) and artificial intelligence (AI), is predestined for processing “Big Data” and holds the potential to revolutionize the field of laboratory medicine. Personalized medicine may in particular benefit from AI-based systems, especially when coupled with readily available wearables and smartphones, which can collect health data from individual patients and offer new, cost-effective access routes to healthcare for patients worldwide. The amount of personal data collected, however, also raises concerns about patient privacy and calls for clear ethical guidelines for “Big Data” research, including rigorous quality checks of data and algorithms to eliminate underlying bias and enable transparency. Likewise, novel federated privacy-preserving data processing approaches may reduce the need for centralized data storage. Generative AI systems, including large language models such as ChatGPT, are currently entering the stage to reshape clinical research, clinical decision-support systems, and healthcare delivery. In our opinion, AI-based systems have tremendous potential to transform laboratory medicine; however, their opportunities should be weighed carefully against the risks. Despite all enthusiasm, we advocate for stringent added-value assessments, just as for any new drug or treatment. Human experts should carefully validate AI-based systems, including patient-privacy protection, to ensure quality, transparency, and public acceptance. In this opinion paper, data prerequisites, recent developments, chances, and limitations of statistical learning approaches are highlighted
Prescriptive Cluster-Dependent Support Vector Machines with an Application to Reducing Hospital Readmissions
We augment linear Support Vector Machine (SVM) classifiers by adding three
important features: (i) we introduce a regularization constraint to induce a
sparse classifier; (ii) we devise a method that partitions the positive class
into clusters and selects a sparse SVM classifier for each cluster; and (iii)
we develop a method to optimize the values of controllable variables in order
to reduce the number of data points which are predicted to have an undesirable
outcome, which, in our setting, coincides with being in the positive class. The
latter feature leads to personalized prescriptions/recommendations. We apply
our methods to the problem of predicting and preventing hospital readmissions
within 30 days of discharge for patients who underwent a general surgical
procedure. To that end, we leverage a large dataset containing over 2.28
million patients who had surgeries in the period 2011--2014 in the U.S. The
dataset has been collected as part of the American College of Surgeons National
Surgical Quality Improvement Program (NSQIP)
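The sparsity-inducing regularization in feature (i) can be sketched, under assumptions, as a hinge-loss linear classifier with an L1 penalty trained by subgradient descent; this is not the paper's exact formulation (which uses a regularization constraint rather than a penalty), only an illustration of how such a term drives uninformative weights toward zero:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: only the first 2 of 10 features are informative
n, d = 600, 10
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:2] = [2.0, -1.5]
y = np.sign(X @ true_w + 0.1 * rng.normal(size=n))

def train_l1_hinge(X, y, lam=0.1, lr=0.01, epochs=400):
    """Hinge loss plus an L1 penalty, minimized by subgradient descent.

    The L1 term shrinks weights of uninformative features toward zero,
    yielding a sparse (and hence more interpretable) classifier."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1  # samples violating the margin
        if active.any():
            # hinge subgradient, averaged over margin violators
            grad = -(y[active, None] * X[active]).mean(axis=0)
        else:
            grad = 0.0
        w -= lr * (grad + lam * np.sign(w))
    return w

w = train_l1_hinge(X, y)
print("informative weights:", np.round(w[:2], 2))
print("median |weight| over noise features:",
      round(float(np.median(np.abs(w[2:]))), 3))
```

A sparse weight vector is what makes the resulting per-cluster classifiers readable enough to support prescriptions on the controllable variables.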
Uniqueness of Medical Data Mining: How the new technologies and data they generate are transforming medicine
The paper describes how the new technologies and data they generate are
transforming medicine. It stresses the uniqueness of heterogeneous medical data
and the ways of dealing with them. It lists different sources that generate big
medical data, their security, legal and ethical issues, as well as machine
learning/AI methods of dealing with them. A unique feature of the paper is use
of case studies to illustrate how the new technologies influence medical
practice
The Bionic Radiologist: avoiding blurry pictures and providing greater insights
Radiology images and reports have long been digitalized. However, the potential of the more than 3.6 billion radiology
examinations performed annually worldwide has largely gone unused in the effort to digitally transform health care. The Bionic
Radiologist is a concept that combines humanity and digitalization for better health care integration of radiology. At a practical
level, this concept will achieve critical goals: (1) testing decisions being made scientifically on the basis of disease probabilities and
patient preferences; (2) image analysis done consistently at any time and at any site; and (3) treatment suggestions that are closely
linked to imaging results and are seamlessly integrated with other information. The Bionic Radiologist will thus help avoid missed
care opportunities, will provide continuous learning in the work process, and will also allow more time for radiologists’ primary
roles: interacting with patients and referring physicians. To achieve that potential, one has to cope with many implementation
barriers at both the individual and institutional levels. These include: reluctance to delegate decision making, a possible decrease in
image interpretation knowledge and the perception that patient safety and trust are at stake. To facilitate implementation of the
Bionic Radiologist the following will be helpful: uncertainty quantifications for suggestions, shared decision making, changes in
organizational culture and leadership style, maintained expertise through continuous learning systems for training, and role
development of the involved experts. With the support of the Bionic Radiologist, disparities are reduced and the delivery of care is
provided in a humane and personalized fashion
Data Analytics in Operations Management: A Review
Research in operations management has traditionally focused on models for
understanding, mostly at a strategic level, how firms should operate. Spurred
by the growing availability of data and recent advances in machine learning and
optimization methodologies, there has been an increasing application of data
analytics to problems in operations management. In this paper, we review recent
applications of data analytics to operations management, in three major areas
-- supply chain management, revenue management and healthcare operations -- and
highlight some exciting directions for the future. (Forthcoming in Manufacturing & Service Operations Management)
Medical data processing and analysis for remote health and activities monitoring
Recent developments in sensor technology, wearable computing, Internet of Things (IoT), and wireless communication have given rise to research in ubiquitous healthcare and remote monitoring of human health and activities. Health monitoring systems involve processing and analysis of data retrieved from smartphones, smart watches, smart bracelets, as well as various sensors and wearable devices. Such systems enable continuous monitoring of patients' psychological and health conditions by sensing and transmitting measurements such as heart rate, electrocardiogram, body temperature, respiratory rate, chest sounds, or blood pressure. Pervasive healthcare, as a relevant application domain in this context, aims at revolutionizing the delivery of medical services through a medical assistive environment and facilitates the independent living of patients. In this chapter, we discuss (1) data collection, fusion, ownership and privacy issues; (2) models, technologies and solutions for medical data processing and analysis; (3) big medical data analytics for remote health monitoring; (4) research challenges and opportunities in medical data analytics; (5) examples of case studies and practical solutions
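As a toy example of the kind of processing such monitoring systems perform, the sketch below flags deviations of a simulated heart-rate stream from a rolling baseline; the signal, window size, and threshold are invented, and real clinical systems use far more robust, validated methods:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical heart-rate stream (beats per minute) with one injected episode
heart_rate = rng.normal(loc=72, scale=3, size=120)
heart_rate[80:84] = [115, 118, 120, 117]   # simulated tachycardia episode

def flag_anomalies(signal, window=20, z_thresh=5.0):
    """Flag samples that deviate strongly from a trailing-window baseline.

    A simple z-score rule against the window's mean and standard deviation;
    purely illustrative, not a clinically validated detector."""
    flags = np.zeros(len(signal), dtype=bool)
    for t in range(window, len(signal)):
        baseline = signal[t - window:t]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(signal[t] - mu) > z_thresh * sigma:
            flags[t] = True
    return flags

flags = flag_anomalies(heart_rate)
print("anomalous sample indices:", np.where(flags)[0])
```

In a deployed system the flagged indices would feed an alerting pipeline rather than a print statement, and the baseline would be personalized per patient.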