Functional data analytic approach of modeling ECG T-wave shape to measure cardiovascular behavior
The T-wave of an electrocardiogram (ECG) represents the ventricular
repolarization that is critical in restoration of the heart muscle to a
pre-contractile state prior to the next beat. Alterations in the T-wave reflect
various cardiac conditions, and links between abnormal (prolonged) ventricular
repolarization and malignant arrhythmias have been documented. Cardiac safety
testing prior to approval of any new drug currently relies on two points of the
ECG waveform: onset of the Q-wave and termination of the T-wave, and only a few
beats are measured. Using functional data analysis, a statistical approach
extracts a common shape for each subject (reference curve) from a sequence of
beats, and then models the deviation of each curve in the sequence from that
reference curve as a four-dimensional vector. The representation can be used to
distinguish differences between beats or to model shape changes in a subject's
T-wave over time. This model provides physically interpretable parameters
characterizing T-wave shape, and is robust to the determination of the endpoint
of the T-wave. Thus, this dimension reduction methodology offers the strong
potential for definition of more robust and more informative biomarkers of
cardiac abnormalities than the QT (or QT corrected) interval in current use.

Comment: Published at http://dx.doi.org/10.1214/09-AOAS273 in the Annals of
Applied Statistics (http://www.imstat.org/aoas/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
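The abstract does not spell out the four-dimensional deviation vector; a minimal sketch of one plausible reading, in which each beat is described by a baseline shift, an amplitude scale, a time shift, and a time dilation relative to the reference curve, is below. The parametrization and the synthetic Gaussian-bump "T-waves" are assumptions for illustration, not the authors' model.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_beat(beat, ref, t):
    """Fit beat(t) ~ a + b * ref((t - c) / d); return (a, b, c, d).

    The 4-vector (a, b, c, d) is a hypothetical stand-in for the paper's
    deviation parameters: baseline, amplitude, time shift, time dilation."""
    def residual(p):
        a, b, c, d = p
        warped = np.interp((t - c) / d, t, ref)  # reference curve, re-timed
        return beat - (a + b * warped)
    sol = least_squares(residual, x0=[0.0, 1.0, 0.0, 1.0],
                        bounds=([-1, 0.1, -0.5, 0.5], [1, 5, 0.5, 2]))
    return sol.x

# Synthetic example: a Gaussian bump as reference, one distorted beat.
t = np.linspace(0.0, 1.0, 200)
ref = np.exp(-((t - 0.5) ** 2) / 0.01)                 # reference curve
beat = 0.05 + 1.3 * np.exp(-((t - 0.55) ** 2) / 0.01)  # shifted, scaled beat
a, b, c, d = fit_beat(beat, ref, t)
print(np.round([a, b, c, d], 2))
```

Because the endpoint of the T-wave never enters the fit directly, a representation of this kind is insensitive to where the wave is truncated, which is the robustness property the abstract emphasizes.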
Data compression and regression based on local principal curves.
Frequently the predictor space of a multivariate regression problem of the type y = m(x_1, …, x_p ) + ε is intrinsically one-dimensional, or at least of far lower dimension than p. Usual modeling attempts such as the additive model y = m_1(x_1) + … + m_p (x_p ) + ε, which try to reduce the complexity of the regression problem by making additional structural assumptions, are then inefficient as they ignore the inherent structure of the predictor space and involve complicated model and variable selection stages. In a fundamentally different approach, one may consider first approximating the predictor space by a (usually nonlinear) curve passing through it, and then regressing the response only against the one-dimensional projections onto this curve. This entails the reduction from a p- to a one-dimensional regression problem.
As a tool for the compression of the predictor space we apply local principal curves. Building on the results presented in Einbeck et al. (Classification – The Ubiquitous Challenge. Springer, Heidelberg, 2005, pp. 256–263), we show how local principal curves can be parametrized and how the projections are obtained. The regression step can then be carried out using any nonparametric smoother. We illustrate the technique using data from the physical sciences.
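The two-stage pipeline (project onto a curve, then smooth against the one-dimensional parametrization) can be sketched as follows. To keep the sketch short it assumes the curve is already known and discretized; in the paper the curve would itself be estimated by local principal curves, and the Nadaraya-Watson smoother is just one choice of nonparametric smoother.

```python
import numpy as np

rng = np.random.default_rng(0)

# Predictors scattered around a known curve (x1, x2) = (s, s^2), s in [0, 1].
s_true = rng.uniform(0.0, 1.0, 300)
X = np.column_stack([s_true, s_true**2]) + rng.normal(0, 0.02, (300, 2))
y = np.sin(3 * s_true) + rng.normal(0, 0.05, 300)

# Stage one: project each predictor onto its nearest point of the discretized
# curve, and use arc length as the one-dimensional parametrization.
grid = np.linspace(0.0, 1.0, 500)
curve = np.column_stack([grid, grid**2])
arc = np.concatenate([[0.0],
                      np.cumsum(np.linalg.norm(np.diff(curve, axis=0), axis=1))])
idx = np.argmin(((X[:, None, :] - curve[None, :, :]) ** 2).sum(-1), axis=1)
s_proj = arc[idx]

# Stage two: regress y on the projections with any nonparametric smoother;
# a Nadaraya-Watson kernel smoother is used here for simplicity.
def nw_smooth(s0, s, y, h=0.05):
    w = np.exp(-0.5 * ((s0 - s) / h) ** 2)
    return (w * y).sum() / w.sum()

pred = np.array([nw_smooth(s0, s_proj, y) for s0 in s_proj])
print(np.sqrt(np.mean((pred - np.sin(3 * s_true)) ** 2)))
```

The p-dimensional regression has been reduced to a univariate one in `s_proj`, which is exactly the dimension reduction the abstract describes.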
Two curve Chebyshev approximation and its application to signal clustering
In this paper we extend a number of important results of the classical
Chebyshev approximation theory to the case of simultaneous approximation of two
or more functions. The need for this extension is application driven, since
such problems appear in the area of curve (signal) clustering. In this
paper we propose a new efficient algorithm for signal clustering and develop a
procedure that allows one to reuse the results obtained at the previous
iteration without recomputing the cluster centres from scratch. This approach
is based on an extension of the classical de la Vallée Poussin procedure
originally developed for polynomial approximation. In this paper, we also
develop necessary and sufficient optimality conditions for two curve Chebyshev
approximation, which is our core tool for curve clustering. These results are
based on the application of nonsmooth convex analysis.
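Simultaneous Chebyshev (uniform) approximation of two sampled curves by one polynomial can be posed as a linear program: minimize z subject to |f_k(t_i) - p(t_i)| <= z for both curves k = 1, 2 at every sample point. The sketch below is a generic discretized formulation, not the authors' algorithm; the test curves sin(t) and sin(t) + 0.2 are assumptions, chosen so the optimal z is clearly at least half their gap (0.1).

```python
import numpy as np
from scipy.optimize import linprog

def two_curve_chebyshev(t, f1, f2, degree):
    """Polynomial minimizing the max deviation from BOTH sampled curves.

    LP variables are [coefficients (degree+1), z]; each sample point of each
    curve contributes the pair of constraints  p - z <= f  and  -p - z <= -f."""
    V = np.vander(t, degree + 1)        # design matrix for p's coefficients
    n = degree + 1
    c = np.zeros(n + 1); c[-1] = 1.0    # objective: minimize z
    A, b = [], []
    for f in (f1, f2):
        A.append(np.hstack([V, -np.ones((len(t), 1))]));  b.append(f)
        A.append(np.hstack([-V, -np.ones((len(t), 1))])); b.append(-f)
    res = linprog(c, A_ub=np.vstack(A), b_ub=np.concatenate(b),
                  bounds=[(None, None)] * n + [(0, None)])
    return res.x[:n], res.x[-1]

t = np.linspace(-1, 1, 101)
coeffs, z = two_curve_chebyshev(t, np.sin(t), np.sin(t) + 0.2, degree=3)
print(round(z, 4))  # >= 0.1, since z >= |f1 - f2| / 2 pointwise
```

For clustering, the common approximant plays the role of a cluster centre and z of a cluster radius; the de la Vallée Poussin-style updates in the paper then avoid re-solving such problems from scratch at each iteration.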
A binary level set method based on k-Means for contour tracking on skin cancer images
Segmentation of skin cancer images has recently become a major focus of research and development activity. This paper presents a novel algorithm that improves the segmentation results of the level set algorithm on skin cancer images. The major contribution of the presented algorithm is to simplify skin cancer images for computer-aided object analysis without loss of significant information, and to decrease the required computational cost. The presented algorithm uses the k-means clustering technique and explores primitive segmentation to obtain an initial label estimate for the level set algorithm. The proposed segmentation method provides better segmentation results than the standard level set segmentation technique and the modified fuzzy c-means clustering technique.
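The k-means initialization step can be sketched in a few lines: cluster the pixel intensities into two groups and convert the resulting binary mask into a signed (+1/-1) binary level set function. This is a generic sketch of the initialization idea only, assuming k = 2 and a synthetic dark-lesion image; the level set evolution itself is not reproduced.

```python
import numpy as np

def kmeans_init_levelset(img, iters=20):
    """Two-cluster Lloyd's algorithm on pixel intensities; the resulting
    binary mask seeds a signed binary level set function (+1 / -1)."""
    x = img.ravel().astype(float)
    c = np.array([x.min(), x.max()])    # initial cluster centres
    for _ in range(iters):
        labels = (np.abs(x - c[0]) > np.abs(x - c[1])).astype(int)
        for k in (0, 1):
            if np.any(labels == k):
                c[k] = x[labels == k].mean()
    mask = labels.reshape(img.shape)
    return np.where(mask == 1, 1.0, -1.0), c

# Synthetic "lesion": dark disc on a brighter, noisy background.
yy, xx = np.mgrid[0:64, 0:64]
img = np.where((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2, 0.2, 0.8)
img = img + np.random.default_rng(1).normal(0, 0.05, img.shape)
phi, centres = kmeans_init_levelset(img)
print(phi.shape, np.round(np.sort(centres), 1))
```

Starting the level set from this coarse labeling, rather than from an arbitrary contour, is what cuts the number of evolution iterations and hence the computational cost.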
Advancing Artificial Intelligence in Sensors, Signals, and Imaging Informatics.
Objective: To identify research works that exemplify recent developments in the field of sensors, signals, and imaging informatics.

Method: A broad literature search was conducted using PubMed and Web of Science, supplemented with individual papers nominated by section editors. A predefined query made from a combination of Medical Subject Heading (MeSH) terms and keywords was used to search both sources. Section editors then filtered the entire set of retrieved papers, with each paper reviewed by two section editors and rated on a three-point Likert scale from 0 (do not include) to 2 (should be included). Only papers with a combined score of 2 or above were considered.

Results: A search for papers was executed at the start of January 2019, resulting in a combined set of 1,459 records published in 2018 in 119 unique journals. Section editors jointly filtered the list of candidates down to 14 nominations. The 14 candidate best papers were then ranked by a group of eight external reviewers. Four papers, representing different international groups and journals, were selected as the best papers by consensus of the International Medical Informatics Association (IMIA) Yearbook editorial board.

Conclusions: The fields of sensors, signals, and imaging informatics have rapidly evolved with the application of novel artificial intelligence/machine learning techniques. Studies have been able to discover hidden patterns and integrate different types of data towards improving diagnostic accuracy and patient outcomes. However, the quality of papers varied widely, without clear reporting standards for these types of models. Nevertheless, a number of papers demonstrated useful techniques to improve the generalizability, interpretability, and reproducibility of increasingly sophisticated models.
A new family of high-resolution multivariate spectral estimators
In this paper, we extend the Beta divergence family to multivariate power
spectral densities. Similarly to the scalar case, we show that it smoothly
connects the multivariate Kullback-Leibler divergence with the multivariate
Itakura-Saito distance. We then study a spectrum approximation problem,
based on the Beta divergence family, which is related to a multivariate
extension of the THREE spectral estimation technique. It is then possible to
characterize a family of solutions to the problem. An upper bound on the
complexity of these solutions will also be provided. Simulations suggest that
the most suitable solution of this family depends on the specific features
required by the estimation problem.
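The interpolation property is easy to check numerically in the scalar case. The sketch below uses the standard scalar Beta divergence formula and verifies that its values near beta = 0 and beta = 1 approach the Itakura-Saito and Kullback-Leibler divergences respectively; the toy AR(1)-like spectrum is an assumption, and the paper's multivariate (matricial) extension is not reproduced here.

```python
import numpy as np

def beta_div(x, y, beta):
    """Beta divergence between positive sampled spectra x and y, averaged
    over the frequency grid (a simple Riemann-sum surrogate for the integral)."""
    if np.isclose(beta, 1.0):        # limit case: Kullback-Leibler
        d = x * np.log(x / y) - x + y
    elif np.isclose(beta, 0.0):      # limit case: Itakura-Saito
        d = x / y - np.log(x / y) - 1.0
    else:                            # general member of the family
        d = (x**beta + (beta - 1) * y**beta
             - beta * x * y**(beta - 1)) / (beta * (beta - 1))
    return d.mean()

# Two toy scalar power spectral densities on a frequency grid.
w = np.linspace(-np.pi, np.pi, 512)
x = 1.0 / (1.05 - np.cos(w))         # peaked, AR(1)-like spectrum
y = np.full_like(w, 2.0)             # flat spectrum

print(round(beta_div(x, y, 0.0), 3), round(beta_div(x, y, 1e-4), 3))
print(round(beta_div(x, y, 1.0), 3), round(beta_div(x, y, 1 - 1e-4), 3))
```

Varying beta between these endpoints traces out the family of estimators, which is how a practitioner would explore which member best suits a given estimation problem.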