
    Clustering Via Nonparametric Density Estimation: the R Package pdfCluster

    The R package pdfCluster performs cluster analysis based on a nonparametric estimate of the density of the observed variables. After summarizing the main aspects of the methodology, we describe the features and usage of the package, and finally illustrate how it works with the aid of two datasets.
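
    For readers unfamiliar with the approach, the sketch below illustrates the general idea of density-based clustering via kernel density estimation: cluster cores are connected regions of high estimated density. It is a toy Python illustration with made-up data and an arbitrary threshold and radius, not the pdfCluster API (which is an R package).

    import numpy as np
    from scipy.stats import gaussian_kde
    from scipy.spatial import distance_matrix
    from scipy.sparse.csgraph import connected_components

    rng = np.random.default_rng(0)
    # Toy data: two well-separated Gaussian blobs (purely illustrative).
    X = np.vstack([rng.normal(0.0, 0.3, size=(100, 2)),
                   rng.normal(3.0, 0.3, size=(100, 2))])

    kde = gaussian_kde(X.T)                        # nonparametric density estimate
    density = kde(X.T)
    high = density >= np.quantile(density, 0.25)   # keep the high-density points
    core = X[high]

    # Group high-density points lying within a small radius of one another and
    # read the clusters off as connected components of that neighbourhood graph.
    adjacency = (distance_matrix(core, core) < 0.5).astype(int)
    n_clusters, labels = connected_components(adjacency, directed=False)
    print("clusters found:", n_clusters)           # expected: 2 for this toy data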

    Type-Based Termination, Inflationary Fixed-Points, and Mixed Inductive-Coinductive Types

    Type systems certify program properties in a compositional way. From a bigger program one can abstract out a part and certify the properties of the resulting abstract program by just using the type of the part that was abstracted away. Termination and productivity are non-trivial yet desired program properties, and several type systems have been put forward that guarantee termination, compositionally. These type systems are intimately connected to the definition of least and greatest fixed-points by ordinal iteration. While most type systems use conventional iteration, we consider inflationary iteration in this article. We demonstrate how this leads to a more principled type system, with recursion based on well-founded induction. The type system has a prototypical implementation, MiniAgda, and we show in particular how it certifies productivity of corecursive and mixed recursive-corecursive functions. Comment: In Proceedings FICS 2012, arXiv:1202.317
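
    For reference, the distinction between conventional and inflationary ordinal iteration that the abstract alludes to can be stated schematically as follows, in generic notation for a monotone operator F on a complete lattice (this is the standard textbook formulation, not a statement taken from the paper):

    % Conventional ordinal iteration of the least fixed point of F:
    \mu^{0}F = \bot, \qquad
    \mu^{\alpha+1}F = F(\mu^{\alpha}F), \qquad
    \mu^{\lambda}F = \bigvee_{\beta<\lambda} \mu^{\beta}F \quad (\lambda\ \text{a limit ordinal})

    % Inflationary iteration collapses the three cases into a single clause,
    % and the stages are increasing in \alpha by construction:
    \mu^{\alpha}F = \bigvee_{\beta<\alpha} F(\mu^{\beta}F)

    For monotone F both iterations stabilise at the least fixed point once the ordinal is large enough; the single-clause inflationary form is, per the abstract, what supports recursion based on well-founded induction.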

    A fast Monte-Carlo method with a Reduced Basis of Control Variates applied to Uncertainty Propagation and Bayesian Estimation

    The Reduced-Basis Control-Variate Monte-Carlo method was introduced recently in [S. Boyaval and T. Lelièvre, CMS, 8 2010] as an improved Monte-Carlo method for the fast estimation of many parametrized expected values at many parameter values. We provide here a more complete analysis of the method, including precise error estimates and convergence results. We also numerically demonstrate that it can be useful in some parametrized frameworks in Uncertainty Quantification, in particular (i) the case where the parametrized expectation is a scalar output of the solution to a Partial Differential Equation (PDE) with stochastic coefficients (an Uncertainty Propagation problem), and (ii) the case where the parametrized expectation is the Bayesian estimator of a scalar output in a similar PDE context. In each case, the PDE has to be solved many times for many values of its coefficients; this is costly, so we also use a reduced basis of PDE solutions as in [S. Boyaval, C. Le Bris, Nguyen C., Y. Maday and T. Patera, CMAME, 198 2009]. To our knowledge, this is the first combination of these Reduced-Basis ideas, here with a view to reducing as much as possible the computational cost of a simple approach to Uncertainty Quantification.
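
    The building block the method accelerates is the plain control-variate Monte-Carlo estimator, sketched below in Python; the reduced-basis construction of the control variates from precomputed solutions is not shown, and the integrand f and control g are hypothetical stand-ins chosen only for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    x = rng.standard_normal(n)

    f = np.exp(0.5 * x)   # hypothetical quantity of interest, E[f] to be estimated
    g = x                 # control variate with a known mean, E[g] = 0
    g_mean = 0.0

    # Coefficient minimising the variance of f - c*(g - E[g]).
    c = np.cov(f, g)[0, 1] / np.var(g, ddof=1)

    plain_estimate = f.mean()
    cv_estimate = (f - c * (g - g_mean)).mean()
    print("plain Monte-Carlo estimate :", plain_estimate)
    print("control-variate estimate   :", cv_estimate)
    print("variance reduction factor  :",
          np.var(f) / np.var(f - c * (g - g_mean)))

    The better the control variate is correlated with the quantity of interest, the larger the variance reduction; the reduced-basis idea is to build such well-correlated controls cheaply across many parameter values.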

    Modeled and Measured Partially Coherent Illumination Speckle Effects from Sloped Surfaces for Tactical Tracking

    The statistical properties of speckle relevant to short- to medium-range (tactical) active tracking involving polychromatic (partially temporally coherent) illumination are investigated. A numerical model is developed to allow rapid simulation of speckled images, including the speckle-contrast reduction effects of illuminator bandwidth, surface slope and roughness, and the polarization properties of both the source and the reflection. Regarding surface slope, Huntley's theory for speckle contrast, which employs geometrical approximations to decrease computation time, is modified to increase accuracy by incorporating a geometrical correction factor and better treatment of roughness and polarization. The resulting model shows excellent agreement with more exact theory over a wide range. An experiment is conducted to validate both the numerical model developed here and existing theory. A short-coherence-length diode laser source is reflected off a silver-coated diffuse surface. Speckle data are gathered for 16 surface slope angles corresponding to speckle contrast between about 0.55 and 1. Taking Hu's theory as truth, the measurements show a -1.1% mean difference with 2.9% standard deviation, while the modified Huntley equation shows a 1.4% mean difference with 1.0% standard deviation. Thus, the theory is validated over the range of this experiment.
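
    As a small illustration of the contrast-reduction mechanism at work here (not the model developed in this work), the Python sketch below simulates how speckle contrast C = std(I)/mean(I) drops when several mutually incoherent speckle patterns add in intensity, as happens with a broadband, partially temporally coherent illuminator; the number of patterns and pixel count are arbitrary.

    import numpy as np

    rng = np.random.default_rng(2)
    npix = 200_000   # number of independent speckle samples (pixels)

    def speckle_intensity():
        # Fully developed, polarized speckle: circular complex Gaussian field,
        # i.e. exponentially distributed intensity with contrast close to 1.
        field = rng.standard_normal(npix) + 1j * rng.standard_normal(npix)
        return np.abs(field) ** 2

    for n_bands in (1, 2, 4, 16):
        # Mutually incoherent spectral components add in intensity.
        intensity = sum(speckle_intensity() for _ in range(n_bands))
        contrast = intensity.std() / intensity.mean()
        print(n_bands, round(contrast, 3), "expected", round(1 / np.sqrt(n_bands), 3))

    For n equal-strength, statistically independent patterns the contrast falls as 1/sqrt(n), which is the kind of reduction the bandwidth, roughness, and polarization effects above produce.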

    Observational Equivalence and Full Abstraction in the Symmetric Interaction Combinators

    The symmetric interaction combinators are an equally expressive variant of Lafont's interaction combinators. They are a graph-rewriting model of deterministic computation. We define two notions of observational equivalence for them, analogous to normal form and head normal form equivalence in the lambda-calculus. Then, we prove a full abstraction result for each of the two equivalences. This is obtained by interpreting nets as certain subsets of the Cantor space, called edifices, which play the same role as Böhm trees in the theory of the lambda-calculus.
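
    Schematically, a full abstraction result of this kind asserts that the interpretation equates exactly the nets identified by the chosen observational equivalence; in generic notation (not the paper's), with \simeq an observational equivalence and \llbracket\cdot\rrbracket the interpretation of nets as edifices:

    \mu \simeq \nu \iff \llbracket \mu \rrbracket = \llbracket \nu \rrbracket

    The paper establishes one statement of this shape for each of its two equivalences.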