
    Graph reconstruction from the observation of diffused signals

    Signal processing on graphs has received considerable attention in recent years. Many techniques have arisen, inspired by classical signal processing, to allow the study of signals on any kind of graph. A common aspect of these techniques is that they require a graph that correctly models the underlying support in order to explain the signals observed on it. However, in many cases such a graph is unavailable or has no real physical existence; an example of the latter is a set of sensors randomly scattered in a field, which clearly observe related information. For such signals there is no intuitive choice of support graph. In this document, we address the problem of inferring a graph structure from the observation of signals, under the assumption that they result from the diffusion of initially i.i.d. signals. To validate our approach, we design an experimental protocol in which we diffuse signals on a known graph; we then discard the graph and show that we are able to retrieve it very precisely from knowledge of the diffused signals alone. Comment: Allerton 2015: 53rd Annual Allerton Conference on Communication, Control and Computing, 30 September - 2 October 2015, Allerton, United States, 201
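The experimental protocol described above can be sketched in a few lines of NumPy. This is a hypothetical toy example, not the authors' code: the ring graph, the diffusion order k, and the sample count are illustrative choices. It diffuses i.i.d. white signals on a known graph W and checks that the sample covariance approaches W^(2k), which is the structure that makes the graph retrievable afterwards:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small example: a symmetric adjacency matrix for a ring graph.
n = 8
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = W[(i + 1) % n, i] = 1.0
W /= np.abs(np.linalg.eigvalsh(W)).max()   # normalize spectral radius to 1

# Diffuse i.i.d. white signals: each observation is x = W^k w with w ~ N(0, I).
k, m = 3, 20000
Wk = np.linalg.matrix_power(W, k)
X = Wk @ rng.standard_normal((n, m))

# For symmetric W, the covariance of the diffused signals is W^(2k), so it
# shares W's eigenvectors: the key fact the recovery exploits.
C = X @ X.T / m
print(np.allclose(C, np.linalg.matrix_power(W, 2 * k), atol=0.1))  # → True
```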

    Characterization and Inference of Graph Diffusion Processes from Observations of Stationary Signals

    Many tools from the field of graph signal processing exploit knowledge of the underlying graph's structure (e.g., as encoded in the Laplacian matrix) to process signals on the graph. When no graph is available, these tools can therefore no longer be used. Researchers have proposed approaches to infer a graph topology from observations of signals on its nodes. Since the problem is ill-posed, these approaches make assumptions such as smoothness of the signals on the graph or sparsity priors. In this paper, we propose a characterization of the space of valid graphs, in the sense that they can explain stationary signals. To simplify the exposition, we focus on the case where signals were i.i.d. at some point back in time and were observed after diffusion on a graph. We show that the set of graphs verifying this assumption has a strong connection with the eigenvectors of the covariance matrix and forms a convex set. Along with a theoretical study in which these eigenvectors are assumed to be known, we consider the practical case where the observations are noisy, and experimentally observe how fast the set of valid graphs converges to the set obtained from the exact eigenvectors as the number of observations grows. To illustrate how this characterization can be used for graph recovery, we present two methods for selecting a particular point in this set under chosen criteria, namely graph simplicity and sparsity. Additionally, we introduce a measure of how well a graph is adapted to signals under a stationarity assumption. Finally, we evaluate how state-of-the-art methods relate to this framework through experiments on a dataset of temperatures. Comment: Submitted to IEEE Transactions on Signal and Information Processing over Networks
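The central characterization, that a valid diffusion matrix must share the covariance's eigenvectors and hence commute with it, can be illustrated with a minimal NumPy sketch (the eigenvector basis U and all eigenvalue choices below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: build a covariance C = U diag(s) U^T with known
# orthonormal eigenvectors U and positive eigenvalues s.
n = 6
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = rng.uniform(0.5, 2.0, n)
C = U @ np.diag(s) @ U.T

# Any graph matrix sharing these eigenvectors can explain the stationary
# signals; the set is parameterized by freely chosen eigenvalues lam.
lam = rng.uniform(-1.0, 1.0, n)
W = U @ np.diag(lam) @ U.T

# Stationarity characterization: W commutes with the covariance matrix.
print(np.allclose(W @ C - C @ W, 0.0))   # → True
```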

    Detection threshold for non-parametric estimation

    A new threshold is presented for better estimating a signal by sparse transform and soft thresholding. This threshold derives from a non-parametric statistical approach dedicated to the detection of a signal with unknown distribution and unknown probability of presence in independent and additive white Gaussian noise. Called the detection threshold, it is particularly appropriate for selecting the few observations, provided by the sparse transform, whose amplitudes are large enough to be considered to contain information about the signal. An upper bound on the risk of the soft thresholding estimate is computed when the detection threshold is used. For a wide class of signals, it is shown that, when the number of observations is large, this upper bound is roughly two to four times smaller than the standard upper bounds given for the universal and minimax thresholds. Many real-world signals belong to this class, as illustrated by several experimental results.
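A minimal sketch of sparse-domain denoising by soft thresholding follows. For illustration it uses the classical universal threshold (Donoho-Johnstone) rather than the paper's detection threshold, whose formula is not reproduced here; the sparse signal and noise level are hypothetical:

```python
import numpy as np

def soft_threshold(y, t):
    """Soft thresholding: shrink toward zero by t, zeroing |y| <= t."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

rng = np.random.default_rng(2)
n, sigma = 1024, 1.0
signal = np.zeros(n)
signal[:16] = 5.0                         # sparse: a few large coefficients
y = signal + sigma * rng.standard_normal(n)

# Universal threshold, shown as a stand-in for the detection threshold.
t_univ = sigma * np.sqrt(2.0 * np.log(n))
x_hat = soft_threshold(y, t_univ)

mse_raw = np.mean((y - signal) ** 2)
mse_est = np.mean((x_hat - signal) ** 2)
print(mse_est < mse_raw)                  # → True
```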

    Wavelet Packets of fractional Brownian motion: Asymptotic Analysis and Spectrum Estimation

    This work provides asymptotic properties of the autocorrelation functions of the wavelet packet coefficients of a fractional Brownian motion. It also discusses the speed of convergence to the limit autocorrelation function when the input random process is either a fractional Brownian motion or a wide-sense stationary second-order random process. The analysis concerns families of wavelet paraunitary filters that converge almost everywhere to the Shannon paraunitary filters. From this analysis, we derive a wavelet-packet-based spectrum estimator for fractional Brownian motions and wide-sense stationary random processes. Experimental tests show good results for estimating the spectrum of 1/f processes.
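As a rough illustration of spectrum estimation for 1/f-type processes, the sketch below fits the power-law exponent of ordinary Brownian motion (H = 1/2, spectrum ~ 1/f^2) from an averaged Fourier periodogram. This deliberately substitutes a plain periodogram for the paper's wavelet packet estimator; only the target quantity, the spectral exponent, is the same:

```python
import numpy as np

rng = np.random.default_rng(3)

# Average the periodogram of several Brownian motion sample paths.
n, trials = 4096, 20
psd = np.zeros(n // 2 - 1)
for _ in range(trials):
    walk = np.cumsum(rng.standard_normal(n))     # Brownian motion sample path
    psd += np.abs(np.fft.rfft(walk))[1:n // 2] ** 2
psd /= trials

# Fit the log-log slope over low frequencies, where the 1/f^2 law holds.
freqs = np.fft.rfftfreq(n)[1:n // 2]
mask = freqs < 0.05
slope, _ = np.polyfit(np.log(freqs[mask]), np.log(psd[mask]), 1)
print(int(round(slope)))                         # → -2
```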

    Smooth Adaptation by Sigmoid Shrinkage

    This work addresses the properties of a sub-class of sigmoid-based shrinkage functions: the non-zero-forcing smooth sigmoid-based shrinkage functions, or SigShrink functions. It provides a SURE optimization of the parameters of the SigShrink functions, performed on an unbiased estimate of the risk obtained with the functions of this sub-class. The SURE SigShrink performance measurements are compared with those of the SURELET (SURE linear expansion of thresholds) parameterization, and SURE SigShrink is shown to perform well in comparison. The relevance of SigShrink lies in the physical meaning and flexibility of its parameters. SigShrink functions apply weak attenuation to data with large amplitudes and stronger attenuation to data with small amplitudes, the shrinkage process introducing little variability among data with close amplitudes. In the wavelet domain, SigShrink is particularly suitable for reducing noise without significantly impacting the signal to be recovered. A remarkable property of this class of sigmoid-based functions is the invertibility of its elements, which makes it possible to smoothly tune contrast (enhancement or reduction).
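One commonly cited form of smooth sigmoid-based shrinkage multiplies each coefficient by a sigmoid gate; the exact SigShrink parameterization should be taken from the paper, so the function and parameter values below are an assumed illustrative form:

```python
import numpy as np

def sigshrink(x, tau, lam):
    # Assumed smooth sigmoid-based shrinkage form: multiply x by a sigmoid
    # gate centered at amplitude lam with slope tau.
    return x / (1.0 + np.exp(-tau * (np.abs(x) - lam)))

# Weak attenuation of large amplitudes, strong attenuation of small ones.
big = sigshrink(10.0, 2.0, 3.0)
small = sigshrink(0.5, 2.0, 3.0)
print(big > 9.9, small < 0.01)   # → True True
```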

    Wavelet Shrinkage: Unification of Basic Thresholding Functions and Thresholds

    This work addresses the unification of some basic functions and thresholds used in the non-parametric estimation of signals by shrinkage in the wavelet domain. The soft and hard thresholding functions are presented as degenerate smooth sigmoid-based shrinkage functions. The shrinkage achieved by this new family of sigmoid-based functions is then shown to be equivalent to a regularisation of wavelet coefficients associated with a class of penalty functions. Some sigmoid-based penalty functions are calculated and their properties discussed. The unification also concerns the universal and minimax thresholds used to calibrate the standard soft and hard thresholding functions: these belong to a wide class of thresholds, called detection thresholds, which depend on two parameters describing the sparsity degree of the wavelet representation of a signal. It is also shown that non-degenerate sigmoid shrinkage adjusted with the new detection thresholds performs as well as the best current parametric and computationally expensive method. This justifies the relevance of sigmoid shrinkage for noise reduction in large databases or large images.
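The degeneracy claim can be illustrated numerically: as the sigmoid slope grows, a sigmoid-gated shrinkage (an assumed form, as in the previous sketch) converges to hard thresholding. A small NumPy check with illustrative parameters:

```python
import numpy as np

def hard_threshold(x, lam):
    """Hard thresholding: keep values above lam in magnitude, zero the rest."""
    return np.where(np.abs(x) > lam, x, 0.0)

def sigmoid_gate(x, tau, lam):
    # Assumed smooth sigmoid shrinkage form; hard thresholding is its
    # steep-slope (tau -> infinity) limit.
    return x / (1.0 + np.exp(-tau * (np.abs(x) - lam)))

x = np.linspace(-5.0, 5.0, 201)   # grid chosen to avoid the point |x| = lam
lam = 1.52
print(np.allclose(sigmoid_gate(x, 400.0, lam), hard_threshold(x, lam),
                  atol=1e-3))     # → True
```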

    Toward a characterization of the uncertainty curve for graphs supporting signals

    Graph signal processing is a recent field that aims to generalize classical signal processing tools in order to analyze signals living on complex domains. These domains are represented by graphs, for which one can compute a matrix called the normalized Laplacian. It has been shown that the eigenvalues of this Laplacian correspond to the frequencies of the Fourier domain in classical signal processing; thus the frequency domain is not the same for every graph supporting signals. One consequence is that there is no non-trivial generalization of Heisenberg's uncertainty principle, which states that a signal cannot be localized in both the time and the frequency domains. One way to generalize this principle, introduced by Agaskar & Lu, is to determine a curve that serves as a lower bound on the trade-off between precision in the graph domain and precision in the spectral domain. The goal of this paper is to propose a characterization of the signals attaining this curve, for a more generic class of graphs than the one studied by Agaskar & Lu.
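A minimal NumPy sketch of the spectral setup described above: the normalized Laplacian of a small graph (here a path graph, an illustrative choice), whose eigenvalues play the role of the graph's frequencies and always lie in [0, 2]:

```python
import numpy as np

# Adjacency matrix of a 5-node path graph.
n = 5
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)

# Normalized Laplacian: L = I - D^(-1/2) A D^(-1/2).
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L = np.eye(n) - D_inv_sqrt @ A @ D_inv_sqrt

# Its eigenvalues, the graph "frequencies", lie in [0, 2].
evals = np.linalg.eigvalsh(L)
print(np.all(evals >= -1e-10), np.all(evals <= 2 + 1e-10))   # → True True
```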

    Competitiveness also comes through taxation: our ideas for adapting the loi de finances 2013 to the competitiveness pact

    The fruit of a collective discussion, this note presents the Fondation's proposals for adjusting the provisions of the loi de finances 2013 to make them consistent with the recommendations of the competitiveness pact. The text grew out of a conversation between Aldo Cardoso, member of the supervisory board of the Fondation pour l'innovation politique; Michel Didier, honorary professor at CNAM and president of Coe-Rexecode; Bertrand Jacquillat, professor at Sciences Po and president of Associés en Finance; Dominique Reynié; and Grégoire Sentilhes, president of NextStage, of the Journées de l'Entrepreneur, and of the G20 YES in France.

    [Quality of anonymous delivery procedures: analysis of practices and ethical issues.]

    OBJECTIVES: The purpose of this study was to assess the implementation of procedures for anonymous delivery and to determine the awareness of the medical team. MATERIAL AND METHODS: We retrospectively reviewed all deliveries performed in a Paris maternity ward during the years 2000-2003 in which the mother requested application of the anonymous procedure. A questionnaire was also sent to all physicians and midwives in the same institution to evaluate their knowledge of the available procedures and their point of view. RESULTS: Among the seventeen deliveries examined, the anonymous procedure was not completely fulfilled in eleven cases, since the name of the mother could be identified. The quality of the files depended on the date at which the decision to use the anonymous procedure was made: delivery was more anonymous when the decision was made at the first consultation, less so when it was made later. The questionnaires revealed that professionals lacked information and were insufficiently aware of the available procedures. CONCLUSION: It appears useful to establish a standard procedure in order to better protect the parturient's wishes and comply with French law (4 March 2002). This point is particularly important since, at the infant's majority, he or she may request access to personal information contained in the medical file.