
    Semiparametric curve alignment and shift density estimation for biological data

    Assume that we observe a large number of curves, all with an identical but unknown shape, each subject to a different random time shift. The objective is to estimate the individual time shifts and their distribution. This problem arises in several biological applications, such as neuroscience or ECG signal processing, where one wants to estimate the distribution of the elapsed time between repetitive pulses, possibly at a low signal-to-noise ratio and without knowledge of the pulse shape. We propose an M-estimator leading to a three-stage algorithm: the data set is split into blocks, on each of which the shifts are estimated by minimizing a cost criterion based on a functional of the periodogram; the estimated shifts are then plugged into a standard density estimator. We show that under mild regularity assumptions the density estimate converges weakly to the true shift distribution. The theory is applied both to simulations and to the alignment of real ECG signals. The estimator of the shift distribution performs well even at low signal-to-noise ratio and is shown to outperform standard curve-alignment methods.
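    The shift-estimation step can be sketched as follows. The FFT cross-correlation below is a simplified stand-in for the paper's periodogram-based cost (not the authors' exact M-estimator), and all signal parameters are illustrative:

    ```python
    import numpy as np

    def estimate_shifts(curves):
        """Estimate the time shift of each noisy curve relative to the first one.

        Simplified stand-in for a periodogram-based M-estimator: each curve is
        aligned to a reference curve by maximizing the circular cross-correlation
        (computed via FFT). Shifts are recovered up to the unknown shift of the
        reference, matching the usual identifiability of the problem.
        """
        n = curves.shape[1]
        ref_spec = np.conj(np.fft.rfft(curves[0]))
        shifts = []
        for y in curves:
            xcorr = np.fft.irfft(np.fft.rfft(y) * ref_spec, n=n)
            shifts.append(int(np.argmax(xcorr)))
        return np.array(shifts)

    # toy example: randomly shifted copies of one pulse plus Gaussian noise
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 256, endpoint=False)
    shape = np.exp(-((t - 0.3) ** 2) / 0.0005)       # sharp synthetic pulse
    true_shifts = rng.integers(0, 40, size=50)
    curves = np.stack([np.roll(shape, s) + 0.01 * rng.standard_normal(256)
                       for s in true_shifts])
    est = estimate_shifts(curves)   # est[i] approximates (true_shifts[i] - true_shifts[0]) mod 256
    ```

    The estimated shifts would then feed a standard kernel density estimator, as in the paper's third stage.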

    The formalisation of specifications from specifications written in natural language

    (pp. 369-374) The activity of specification is growing considerably; an enormous quantity of pages is written every day, for the most part in natural language. For CNET, which carries out studies of the services and equipment of France Telecom and oversees the specification and validation stages, the need to reduce service development time is a priority. One way to achieve this objective is to formalise as many of the received specifications as possible. With this in mind, we aim to demonstrate that the passage from the informal to the formal can be partly automated, by means of proven methods and tools available to assist an expert in specifications. To this end we propose a formalisation process that relies on an intermediate representation of the specifications in the conceptual-graph formalism before arriving at a formal description in Z of the initial specification.

    A method and a tool for geocoding and record linkage

    For many years, researchers have presented the geocoding of postal addresses as a challenge, and several research works have been devoted to the geocoding process. This paper presents theoretical and technical aspects of geolocalization, geocoding, and record linkage. It shows the possibilities and limitations of existing methods and commercial software, identifying areas for further research. In particular, we present a methodology and a computing tool for the correction and geocoding of mailing addresses. The paper presents the two main steps of the methodology: the first, preliminary step is address correction (address matching), while the second carries out the geocoding of the identified addresses. Additionally, we present some results from the processing of real data sets. Finally, in the discussion, areas for further research are identified. Keywords: address correction; geocoding; matching; data management; record linkage
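    The address-correction (matching) step can be illustrated with simple fuzzy string matching. This is a minimal sketch, not the paper's method; the reference street list and the 0.6 acceptance threshold are illustrative:

    ```python
    import difflib

    def match_address(raw, reference):
        """Map a raw mailing address to the closest entry of a reference list.

        Normalizes whitespace and case, then scores candidates with a
        sequence-similarity ratio; returns (best_match, score), or
        (None, score) when no candidate is similar enough.
        """
        query = " ".join(raw.lower().split())
        best = max(reference,
                   key=lambda ref: difflib.SequenceMatcher(None, query, ref.lower()).ratio())
        score = difflib.SequenceMatcher(None, query, best.lower()).ratio()
        return (best, score) if score >= 0.6 else (None, score)

    # hypothetical reference list
    streets = ["12 rue de la Paix, Paris", "3 avenue Foch, Lyon"]
    hit, score = match_address("12 Rue  de la  paix paris", streets)
    ```

    A corrected address would then be passed to the geocoding step proper, which assigns coordinates from a reference database.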

    Redemptive Suffering in Judaism? An Interview with Shmuel Trigano

    This interview with Shmuel Trigano, editor of Pardès, holds that it is not suffering as such that has value in Judaism, but the testimony of faith. One certainly finds in Judaism a mystique of suffering for the sanctification of the Name (Kidouch haChem), but it does not carry the weight that redemptive suffering has in Christianity. And while the Bible attributes to God some involvement in violence, Trigano considers that, on the historical level, one must look for human rather than divine responsibility. This is how he approaches the Shoah and the violence in the Middle East, and he is appalled by religious interpretations that attribute their design or responsibility to God.

    Nonparametric inference of photon energy distribution from indirect measurements

    We consider a density estimation problem arising in nuclear physics. Gamma photons impinge on a semiconductor detector, producing pulses of current. The integral of such a pulse equals the total charge created by the photon in the detector, which is linearly related to the photon energy. Because the inter-arrival time of photons can be shorter than the charge collection time, pulses corresponding to different photons may overlap, a phenomenon known as pileup. The distortion of the estimated photon energy spectrum due to pileup worsens as the photon rate increases, making pileup correction techniques a must for high counting rate experiments. In this paper, we present a novel technique to correct pileup, which extends a method introduced by Hall and Park (2004) for estimating the service time from the busy period in M/G/∞ models. It is based on a novel formula linking the joint distribution of the energy and duration of a cluster of pulses to the distribution of the photon energies. We then assess the performance of this estimator by providing an expression for its integrated square error. A Monte Carlo experiment illustrates on practical examples the benefits of the pileup correction.
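    The pileup phenomenon itself is easy to reproduce in simulation. The sketch below (illustrative rate, pulse duration, and energy law; not the paper's estimator) generates Poisson photon arrivals, groups overlapping fixed-length pulses into clusters, and measures how many individual photons are hidden inside clusters:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    rate, pulse_len, t_max = 2.0, 0.4, 10_000.0   # illustrative values

    # homogeneous Poisson arrivals via exponential inter-arrival times
    arrivals = np.cumsum(rng.exponential(1.0 / rate, size=int(3 * rate * t_max)))
    arrivals = arrivals[arrivals < t_max]
    energies = rng.gamma(2.0, 1.0, size=arrivals.size)   # stand-in photon energies

    # a new cluster starts whenever the gap to the previous arrival exceeds
    # the pulse duration, i.e. the pulses no longer overlap
    gaps = np.diff(arrivals)
    new_cluster = np.concatenate(([True], gaps > pulse_len))
    cluster_id = np.cumsum(new_cluster) - 1

    # observed quantities: total energy per cluster (individual energies are lost)
    cluster_energy = np.bincount(cluster_id, weights=energies)
    pileup_fraction = 1.0 - cluster_energy.size / arrivals.size
    ```

    For exponential gaps, the expected fraction of photons absorbed into an earlier cluster is 1 − exp(−rate · pulse_len), here about 0.55, which is why only cluster-level energies and durations are observable at high rates.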

    Sparse regression algorithm for activity estimation in γ spectrometry

    We consider the estimation of the counting rate of an unknown radioactive source, which emits photons at times modeled by a homogeneous Poisson process. A spectrometer converts the energy of incoming photons into electrical pulses, whose number provides a rough estimate of the intensity of the Poisson process. When the activity of the source is high, a physical phenomenon known as the pileup effect distorts direct measurements, introducing a significant bias into the standard activity estimators used so far in the field. We show in this paper that the counting rate estimation problem can be cast as a sparse regression problem. We propose a post-processed, non-negative version of the Least Absolute Shrinkage and Selection Operator (LASSO) to estimate the photon arrival times. The main difficulty is that no theoretical conditions can guarantee the sparsity consistency of the LASSO, because the dictionary is not ideal and the signal is sampled. We therefore derive theoretical conditions and bounds showing that the proposed method can nonetheless provide a good estimate of the counting rate, close to the best attainable. The good performance of the proposed approach is demonstrated on simulations and real data sets.
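    The sparse-regression step can be sketched with a generic non-negative LASSO solved by projected proximal gradient (ISTA). The dictionary of shifted pulse shapes, the decay constant, and the regularization weight below are illustrative, and the paper's post-processing of the estimate is not reproduced:

    ```python
    import numpy as np

    def nonneg_lasso(D, y, lam=0.05, n_iter=2000):
        """Non-negative LASSO via projected ISTA.

        Minimizes 0.5 * ||y - D @ x||^2 + lam * sum(x) subject to x >= 0.
        The proximal step for the penalty plus positivity constraint is a
        shifted clipping at zero.
        """
        L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
        x = np.zeros(D.shape[1])
        for _ in range(n_iter):
            grad = D.T @ (D @ x - y)
            x = np.maximum(0.0, x - (grad + lam) / L)
        return x

    # toy dictionary: each column is a decaying pulse starting at one sample
    n = 120
    pulse = np.exp(-np.arange(25) / 6.0)
    D = np.zeros((n, n))
    for k in range(n):
        m = min(25, n - k)
        D[k:k + m, k] = pulse[:m]

    # two photon arrivals whose pulses pile up, plus measurement noise
    x_true = np.zeros(n)
    x_true[20], x_true[28] = 1.0, 0.8
    rng = np.random.default_rng(1)
    y = D @ x_true + 0.01 * rng.standard_normal(n)

    x_hat = nonneg_lasso(D, y)   # sparse estimate of arrival times/amplitudes
    ```

    The support of `x_hat` plays the role of the estimated photon arrival times; counting its elements per unit time gives a pileup-robust activity estimate.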

    WRITING FORMAL SPECIFICATIONS: ELABORATION FROM SPECIFICATIONS WRITTEN IN NATURAL LANGUAGE

    The activity of specification is growing considerably; a multitude of pages are written every day, mostly in natural language. For CNET (Centre National d'Etudes des Télécommunications), which carries out studies of France Telecom services and equipment and oversees the specification and validation stages, the need to reduce service development time is a priority. One condition for reaching this objective is to formalise as many of the produced specifications as possible. In this context, we try to show that the passage from the informal to the formal can be partly automated, thanks to reliable methods and tools capable of assisting a human specification expert. To this end we propose a formalisation process based on an intermediate representation of the specifications in the conceptual-graph formalism, before deriving a formal description in Z of the initial specification.

    Fast Digital Filtering of Spectrometric Data for Pile-up Correction

    This paper considers a problem stemming from the analysis of spectrometric data. When performing experiments on highly radioactive matter, the electrical pulses recorded by the spectrometer tend to overlap, yielding severe distortions in the histogram of the pulses' energies. In this paper, we propose a fast recursive algorithm that efficiently estimates this histogram from measurements of the durations and energies of overlapping pulses. Its good performance is demonstrated on both simulations and real data. Furthermore, its low algorithmic complexity makes it well suited to real-time implementation.