
    Latent variable analysis of causal interactions in atrial fibrillation electrograms

    Multi-channel intracardiac electrocardiograms of atrial fibrillation (AF) patients are acquired at the electrophysiology laboratory in order to guide radiofrequency (RF) ablation surgery. Unfortunately, the success rate of RF ablation is still moderate, since the mechanisms underlying AF initiation and maintenance are not yet precisely known. In this paper, we use an advanced machine learning model, the Gaussian process latent force model (GP-LFM), to infer the relationship between the observed signals and the unknown (latent or exogenous) sources causing them. The resulting GP-LFM provides valuable information about signal generation and propagation inside the heart, and can then be used to perform causal analysis. Results on realistic synthetic signals, generated using the FitzHugh-Nagumo model, showcase the potential of the proposed approach.
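    The FitzHugh-Nagumo model mentioned above is a standard two-variable excitable-media model; a minimal sketch of how such synthetic activation signals can be generated follows (the parameter values and Euler step are illustrative choices, not the paper's setup):

```python
import numpy as np

def fitzhugh_nagumo(T=2000, dt=0.05, I_ext=0.5, a=0.7, b=0.8, eps=0.08):
    """Euler integration of the FitzHugh-Nagumo model:
        dv/dt = v - v^3/3 - w + I_ext
        dw/dt = eps * (v + a - b * w)
    Returns the membrane-potential trace v."""
    v, w = -1.0, 1.0
    trace = np.empty(T)
    for t in range(T):
        dv = v - v**3 / 3.0 - w + I_ext
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        trace[t] = v
    return trace

signal = fitzhugh_nagumo()
print(signal[:5])
```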

    Efficient Monte Carlo optimization for multi-label classifier chains

    Multi-label classification (MLC) is the supervised learning problem where an instance may be associated with multiple labels. Modeling dependencies between labels allows MLC methods to improve their performance at the expense of an increased computational cost. In this paper, we focus on the classifier chains (CC) approach for modeling dependencies. On the one hand, the original CC algorithm makes a greedy approximation, and is fast but tends to propagate errors down the chain. On the other hand, a recent Bayes-optimal method improves the performance, but is computationally intractable in practice. Here we present a novel double Monte Carlo scheme (M2CC) both for finding a good chain sequence and for performing efficient inference. The M2CC algorithm remains tractable for high-dimensional data sets and obtains the best overall accuracy, as shown on several real data sets with input dimensions as high as 1449 and up to 103 labels.
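    A minimal sketch of the inference half of such a scheme: sample label vectors sequentially along a trained chain and keep the most probable one. The toy data, fixed chain order, and logistic base learner are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy multi-label data: X (n x d) features, Y (n x L) binary labels.
n, d, L = 200, 5, 3
X = rng.normal(size=(n, d))
Y = (X @ rng.normal(size=(d, L)) + 0.5 * rng.normal(size=(n, L)) > 0).astype(int)

# Train a classifier chain: label j sees X plus labels 0..j-1.
chain = []
for j in range(L):
    feats = np.hstack([X, Y[:, :j]])
    chain.append(LogisticRegression().fit(feats, Y[:, j]))

def mc_inference(x, n_samples=100):
    """Sample label vectors along the chain; return the most probable one."""
    best_y, best_logp = None, -np.inf
    for _ in range(n_samples):
        y, logp = np.zeros(L), 0.0
        for j in range(L):
            feats = np.concatenate([x, y[:j]])[None, :]
            p1 = chain[j].predict_proba(feats)[0, 1]
            y[j] = rng.random() < p1
            logp += np.log(p1 if y[j] else 1.0 - p1)
        if logp > best_logp:
            best_y, best_logp = y.copy(), logp
    return best_y

print(mc_inference(X[0]))
```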

    Efficient Monte Carlo methods for multi-dimensional learning with classifier chains

    Multi-dimensional classification (MDC) is the supervised learning problem where an instance is associated with multiple classes, rather than with a single class, as in traditional classification problems. Since these classes are often strongly correlated, modeling the dependencies between them allows MDC methods to improve their performance, at the expense of an increased computational cost. In this paper, we focus on the classifier chains (CC) approach for modeling dependencies, one of the most popular and highest-performing methods for multi-label classification (MLC), a particular case of MDC which involves only binary classes (i.e., labels). The original CC algorithm makes a greedy approximation, and is fast but tends to propagate errors along the chain. Here we present novel Monte Carlo schemes, both for finding a good chain sequence and for performing efficient inference. Our algorithms remain tractable for high-dimensional data sets and obtain the best predictive performance across several real data sets.
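    Complementing the previous sketch, the chain-sequence search can be illustrated with plain Monte Carlo over random label orders, scored by held-out exact-match accuracy (a simple stand-in for the paper's scheme; data and learner are again illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def fit_chain(X, Y, order):
    """Train one classifier per label, each seeing the earlier labels."""
    models = []
    for k, j in enumerate(order):
        feats = np.hstack([X, Y[:, order[:k]]])
        models.append(LogisticRegression().fit(feats, Y[:, j]))
    return models

def chain_accuracy(models, order, X, Y):
    """Greedy prediction along the chain; exact-match accuracy."""
    pred = np.zeros_like(Y)
    for k, j in enumerate(order):
        feats = np.hstack([X, pred[:, order[:k]]])
        pred[:, j] = models[k].predict(feats)
    return np.mean(np.all(pred == Y, axis=1))

# Toy data: X features, Y binary label matrix; simple train/validation split.
n, d, L = 300, 5, 4
X = rng.normal(size=(n, d))
Y = (X @ rng.normal(size=(d, L)) > 0).astype(int)
Xtr, Ytr, Xva, Yva = X[:200], Y[:200], X[200:], Y[200:]

best_order, best_acc = None, -1.0
for _ in range(20):                      # Monte Carlo search over chain orders
    order = list(rng.permutation(L))
    acc = chain_accuracy(fit_chain(Xtr, Ytr, order), order, Xva, Yva)
    if acc > best_acc:
        best_order, best_acc = order, acc
print(best_order, best_acc)
```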

    Independent doubly Adaptive Rejection Metropolis Sampling

    Adaptive Rejection Metropolis Sampling (ARMS) is a well-known MCMC scheme for generating samples from one-dimensional target distributions. ARMS is widely used within Gibbs sampling, where automatic and fast samplers are often needed to draw from univariate full-conditional densities. In this work, we propose an alternative adaptive algorithm (IA2RMS) that overcomes the main drawback of ARMS (an incomplete adaptation of the proposal in some cases), speeding up the convergence of the chain to the target. Numerical results show that IA2RMS outperforms the standard ARMS, providing a correlation among samples that is close to zero.
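    A simplified sketch of the ARMS idea on a bounded domain: a piecewise-constant proposal is refined with rejected candidates, and a Metropolis-Hastings test keeps the chain exact even where the envelope falls below the target; recycling the point discarded in the MH test mimics the fuller adaptation of IA2RMS. The envelope construction and the support cap are simplifications, not the papers' constructions:

```python
import numpy as np

rng = np.random.default_rng(2)

def build_envelope(support, p, a, b):
    """Piecewise-constant (unnormalized) proposal from the support points:
    on each interval, the height is the larger endpoint density."""
    pts = np.unique(np.concatenate([[a], support, [b]]))
    heights = np.array([max(p(pts[i]), p(pts[i + 1])) for i in range(len(pts) - 1)])
    return pts, heights

def draw_from_envelope(pts, heights):
    w = heights * np.diff(pts)
    i = rng.choice(len(w), p=w / w.sum())
    return rng.uniform(pts[i], pts[i + 1]), heights[i]

def envelope_at(pts, heights, x):
    i = int(np.clip(np.searchsorted(pts, x) - 1, 0, len(heights) - 1))
    return heights[i]

def ia2rms_like(p, a, b, n_iter=3000, max_support=50):
    support = [a + (b - a) / 3.0, a + 2.0 * (b - a) / 3.0]
    x, samples = 0.5 * (a + b), []
    for _ in range(n_iter):
        pts, heights = build_envelope(support, p, a, b)
        # Rejection test against the (possibly non-dominating) envelope;
        # rejected candidates are added to the support, refining the proposal.
        while True:
            xp, qxp = draw_from_envelope(pts, heights)
            if rng.random() <= min(1.0, p(xp) / qxp):
                break
            if len(support) < max_support:
                support.append(xp)
                pts, heights = build_envelope(support, p, a, b)
        # Metropolis-Hastings correction (the ARMS acceptance rule) keeps
        # the chain exact even where the envelope lies below the target.
        px, pxp = p(x), p(xp)
        qx = envelope_at(pts, heights, x)
        alpha = min(1.0, pxp * min(px, qx) / (px * min(pxp, qxp)))
        if rng.random() <= alpha:
            if len(support) < max_support:
                support.append(x)   # IA2RMS also recycles the discarded point
            x = xp
        samples.append(x)
    return np.array(samples)

# Example: standard normal truncated to [-4, 4].
s = ia2rms_like(lambda t: np.exp(-0.5 * t * t), -4.0, 4.0)
print(s.mean(), s.std())
```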

    Monte Carlo limit cycle characterization

    The fixed-point implementation of IIR digital filters usually leads to the appearance of zero-input limit cycles, which degrade the performance of the system. In this paper, we develop an efficient Monte Carlo algorithm to detect and characterize limit cycles in fixed-point IIR digital filters. The proposed approach considers filters formulated in the state space and is valid for any fixed-point representation and quantization function. Numerical simulations on several high-order filters, where an exhaustive search is infeasible, show the effectiveness of the proposed approach.
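    A minimal sketch of the Monte Carlo search described above: iterate the quantized state-space recursion from random initial states and record any nonzero periodic orbit reached. The rounding quantizer, word length, and filter coefficients are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

def quantize(x, step=2.0**-8):
    """Round each state to the fixed-point grid (rounding quantizer)."""
    return np.round(x / step) * step

def find_limit_cycles(A, n_trials=1000, max_steps=500):
    """Monte Carlo search: iterate the quantized state-space filter
    x[n+1] = Q(A x[n]) from random initial states and record any
    nonzero periodic orbit (zero-input limit cycle) that is reached."""
    cycles = set()
    for _ in range(n_trials):
        x = quantize(rng.uniform(-1, 1, size=A.shape[0]))
        seen = {}
        for n in range(max_steps):
            key = tuple(x)
            if key in seen:                       # state revisited: cycle found
                if any(abs(v) > 0 for v in key):  # ignore the zero fixed point
                    cycles.add((n - seen[key], key))
                break
            seen[key] = n
            x = quantize(A @ x)
    return cycles  # set of (period, representative state on the cycle)

# Stable second-order filter with poles near the unit circle (illustrative).
A = np.array([[0.0, 1.0], [-0.9, 1.8]])
for period, state in sorted(find_limit_cycles(A))[:5]:
    print(period, state)
```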

    On the Generalized Ratio of Uniforms as a Combination of Transformed Rejection and Extended Inverse of Density Sampling

    Document deposited in the arXiv.org repository. Version: arXiv:1205.0482v6 [stat.CO]. In this work we investigate the relationship among three classical sampling techniques: the inverse of density (Khintchine's theorem), the transformed rejection (TR) and the generalized ratio of uniforms (GRoU). Given a monotonic probability density function (PDF), we show that the transformed area obtained using the generalized ratio of uniforms method can be found equivalently by applying the transformed rejection sampling approach to the inverse function of the target density. Then we provide an extension of the classical inverse-of-density idea, showing that it is completely equivalent to the GRoU method for monotonic densities. Although we concentrate on monotonic PDFs, we also discuss how the results presented here can be extended to any non-monotonic PDF that can be decomposed into a collection of intervals where it is monotonically increasing or decreasing. In this general case, we show the connections of the GRoU technique with transformations of certain random variables and with the generalized inverse PDF. Finally, we also introduce a GRoU technique to handle unbounded target densities.
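    As background, the classical ratio-of-uniforms result that GRoU generalizes, together with the common power-parameter form; this is stated only for reference, and the paper's generalization is broader:

```latex
% Classical ratio of uniforms: if (U, V) is uniformly distributed on
%   A = { (u, v) : 0 < u <= sqrt(p(v/u)) },
% then X = V/U has density proportional to p.
% A common power-parameter generalization (r = 1 recovers the classical case):
\[
  \mathcal{A}_r \;=\; \Bigl\{ (u, v) \;:\; 0 < u \le
      \bigl[\, p\!\left( v / u^{r} \right) \bigr]^{1/(r+1)} \Bigr\},
  \qquad X \;=\; \frac{V}{U^{r}} \;\sim\; p .
\]
```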

    An adaptive population importance sampler

    Monte Carlo (MC) methods are widely used in signal processing, machine learning and communications for statistical inference and stochastic optimization. A well-known class of MC methods is composed of importance sampling and its adaptive extensions (e.g., population Monte Carlo). In this work, we introduce an adaptive importance sampler using a population of proposal densities. The novel algorithm iteratively provides a global estimate of the variables of interest, using all the samples generated. The cloud of proposals is adapted by learning from a subset of previously generated samples, so that local features of the target density are better taken into account than with a single global adaptation procedure. Numerical results show the advantages of the proposed sampling scheme in terms of mean absolute error and robustness to initialization.
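    A minimal sketch of a population importance sampler: a cloud of Gaussian proposals, deterministic-mixture weights computed against the whole mixture, and resampling-based relocation of the proposal means. The bimodal target and the resampling adaptation rule are illustrative stand-ins, not the paper's scheme:

```python
import numpy as np

rng = np.random.default_rng(4)

def target_logpdf(x):
    """Illustrative unnormalized bimodal target (two Gaussian modes)."""
    return np.logaddexp(-0.5 * (x - 3) ** 2, -0.5 * (x + 3) ** 2)

N, M, sigma, n_iter = 10, 50, 1.0, 20          # proposals, samples each, scale
mu = rng.uniform(-10, 10, size=N)              # initial proposal means

all_x, all_w = [], []
for _ in range(n_iter):
    # Draw M samples from each Gaussian proposal in the population.
    x = (mu[:, None] + sigma * rng.normal(size=(N, M))).ravel()
    # Deterministic-mixture weights: target over the mixture of all proposals.
    q = np.mean(np.exp(-0.5 * (x[None, :] - mu[:, None]) ** 2 / sigma**2),
                axis=0) / np.sqrt(2 * np.pi * sigma**2)
    w = np.exp(target_logpdf(x)) / q
    all_x.append(x); all_w.append(w)
    # Adapt: relocate each proposal mean by weighted resampling of the cloud.
    mu = rng.choice(x, size=N, p=w / w.sum())

# Global self-normalized estimate using ALL samples ever generated.
xs, ws = np.concatenate(all_x), np.concatenate(all_w)
print("estimated mean:", np.sum(ws * xs) / np.sum(ws))   # true mean is 0
```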

    Grouped sparsity algorithm for multichannel intracardiac ECG synchronization

    In this paper, a new method is presented to ensure automatic synchronization of intracardiac ECG data, yielding a three-stage algorithm. We first compute a robust estimate of the derivative of the data to remove low-frequency perturbations. Then we provide a grouped-sparse representation of the data, by means of the Group LASSO, to ensure that all the electrical spikes are detected simultaneously. Finally, a post-processing step, based on a variance analysis, is performed to discard false alarms. Preliminary results on real data for sinus rhythm and atrial fibrillation show the potential of this approach.
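    The grouping idea can be illustrated with an identity design, where each time instant forms one group across all channels and the Group LASSO solution reduces to row-wise block soft-thresholding; the derivative-estimation and variance-based post-processing stages of the paper are omitted, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic multichannel recording: spikes occur at the same instants in all
# channels, plus noise (a stand-in for intracardiac electrograms).
T, C = 500, 4                      # time samples, channels
Y = 0.1 * rng.normal(size=(T, C))
spike_times = [100, 250, 400]
for t in spike_times:
    Y[t] += rng.uniform(1.0, 2.0, size=C)

# Group LASSO with an identity design: minimize
#   0.5 * ||X - Y||_F^2 + lam * sum_t ||X[t, :]||_2,
# whose closed-form solution is row-wise (block) soft-thresholding,
# zeroing an instant in all channels jointly.
lam = 0.8
norms = np.linalg.norm(Y, axis=1, keepdims=True)
X = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12)) * Y

detected = np.where(np.linalg.norm(X, axis=1) > 0)[0]
print("detected spike instants:", detected)
```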

    Almost rejectionless sampling from Nakagami-m distributions (m ≥ 1)

    The Nakagami-m distribution is widely used for the simulation of fading channels in wireless communications. A novel, simple and extremely efficient acceptance-rejection algorithm is introduced for the generation of independent Nakagami-m random variables. The proposed method uses another Nakagami density with a half-integer value of the fading parameter, m_p = n/2 ≤ m, as proposal function, from which samples can be drawn exactly and easily. This novel rejection technique is able to work with arbitrary values of m ≥ 1 and of the average path energy, Ω, and provides a higher acceptance rate than all currently available methods. Summary: an extremely efficient method, based on rejection sampling, for generating Nakagami random variables (used to model fading in mobile communication channels).
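    A minimal sketch of such a rejection scheme: draw from a Nakagami proposal with half-integer fading parameter (exact via a Gamma variate, here generated with NumPy rather than sums of squared Gaussians) and accept with the bounded density ratio. Setting the proposal energy equal to the target's is an illustrative choice, not necessarily the paper's optimal one:

```python
import numpy as np

rng = np.random.default_rng(6)

def nakagami_rejection(m, omega, size):
    """Draw Nakagami-m samples (m >= 1) by rejection from a Nakagami
    proposal with half-integer fading parameter mp = floor(2m)/2 <= m,
    which is exact: if Y ~ Gamma(shape=mp, scale=omega/mp), then
    sqrt(Y) is Nakagami(mp, omega)."""
    mp = np.floor(2 * m) / 2             # half-integer proposal parameter
    delta = m - mp                       # 0 <= delta < 1/2
    out, filled = np.empty(size), 0
    while filled < size:
        x = np.sqrt(rng.gamma(shape=mp, scale=omega / mp, size=size))
        if delta == 0:                   # m itself half-integer: no rejection
            acc = np.ones(size, dtype=bool)
        else:
            # target/proposal ratio is proportional to
            # x^(2*delta) * exp(-delta * x^2 / omega), maximized at x^2 = omega,
            # which gives the acceptance probability below.
            r = (x**2 / omega) ** delta * np.exp(delta * (1.0 - x**2 / omega))
            acc = rng.random(size) <= r
        take = min(np.count_nonzero(acc), size - filled)
        out[filled:filled + take] = x[acc][:take]
        filled += take
    return out

s = nakagami_rejection(m=2.3, omega=1.0, size=100000)
print("E[x^2] ≈", np.mean(s**2), "(should be close to omega = 1.0)")
```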

    Causality analysis of atrial fibrillation electrograms

    Proceedings of the 2015 Computing in Cardiology Conference (CinC 2015), September 6-9, 2015, Nice, France. Multi-channel intracardiac electrocardiograms (electrograms) are sequentially acquired during heart surgery performed on patients with sustained atrial fibrillation (AF) to guide radio-frequency catheter ablation. These electrograms are used by cardiologists to determine candidate areas for ablation (e.g., areas corresponding to high dominant frequencies or complex electrograms). In this paper, we introduce a novel hierarchical causality analysis method for the multi-output, sequentially acquired electrograms. The causal model obtained provides important information regarding delays among signals as well as the direction and strength of their causal connections. The tool developed may ultimately serve to guide cardiologists towards candidate areas for catheter ablation. Preliminary results on synthetic signals are used to validate the proposed approach. This work has been supported by the Spanish government's projects ALCIT (TEC2012-38800-C03-01), AGES (S2010/BMD-2422), OTOSiS (TEC2013-41718-R), and COMPREHENSION (TEC2012-38883-C02-01). D. Luengo has also been funded by the BBVA Foundation's "I Convocatoria de Ayudas Fundación BBVA a Investigadores, Innovadores y Creadores Culturales".
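    A generic pairwise Granger-style score (not the paper's hierarchical multi-output model) illustrates the kind of directed, lag-aware causal analysis described: compare how well one channel is predicted with and without the lagged past of another. The two-channel synthetic signal and lag order are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

def granger_score(x, y, p=5):
    """Does the past of x help predict y?  Compare the residual variance of
    an AR(p) model on y alone vs. y plus lagged x (a ratio well above 1
    indicates a directed link from x to y, with a lag structure)."""
    T = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
    res_r = Y - lags_y @ np.linalg.lstsq(lags_y, Y, rcond=None)[0]
    full = np.hstack([lags_y, lags_x])
    res_f = Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]
    return np.var(res_r) / np.var(res_f)

# Two synthetic channels: ch1 drives ch2 with a 3-sample delay.
T = 2000
ch1 = rng.normal(size=T)
ch2 = np.roll(ch1, 3) + 0.5 * rng.normal(size=T)
ch2[:3] = rng.normal(size=3)   # remove the wrap-around samples

print("ch1 -> ch2:", granger_score(ch1, ch2))   # clearly > 1
print("ch2 -> ch1:", granger_score(ch2, ch1))   # close to 1
```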