39 research outputs found

    Improvement of path openings for the analysis of N-dimensional images, and optimised implementations (Amélioration des ouvertures par chemins pour l'analyse d'images à N dimensions et implémentations optimisées)

    Get PDF
    The detection of thin, oriented features in an image opens up a wide range of applications, in particular in medical imaging, materials science and remote sensing. Path openings and closings are efficient morphological operators that use flexible oriented paths as structuring elements. They are employed in much the same way as operators that use rotated line segments as structuring elements, but are more effective because they can detect linear structures that are not necessarily locally perfectly straight. While their theory has always allowed paths in arbitrary dimensions, practical implementations were only proposed in 2D. Recently, a new implementation was proposed that enables the efficient computation of d-dimensional path operators. However, this implementation is not robust to noise. In practical applications, for path operators to be effective, the structuring elements must be long enough to match the length of the features to be detected, yet path operators become increasingly sensitive to noise as their length parameter L grows. The first part of this work addresses this limitation: we propose an efficient d-dimensional algorithm, the robust path operators, which uses a larger family of flexible structuring elements. Given a length L and a robustness parameter G, path propagation is allowed across disconnections of at most G pixels, which makes G independent of L. This simple modification leads to constant memory bookkeeping and low computational complexity. The resulting operators have been compared, both qualitatively and quantitatively, with other efficient methods for the detection of line-like features. As an application, robust path openings have been integrated into a complete image-processing and modelling chain for the characterization of glass-fiber-reinforced polymers. Our study also led us to investigate recent morphological connected filters based on geodesic measurements. These filters are a good alternative to path operators because they are effective at detecting so-called "tortuous" shapes, which is precisely the main limitation of path operators. Combining the local robustness of the robust path operators with the ability of geodesic attribute-based filters to recover tortuous shapes has enabled us to propose another original algorithm, the selective and robust path operators.
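
    To give a feel for the roles of L and G described above, here is a minimal, hypothetical sketch of gap-tolerant propagation restricted to horizontal runs on a binary image: a run of foreground pixels may bridge at most G consecutive background pixels, and only runs accumulating at least L foreground pixels are kept. The function name and this 1-D simplification are assumptions for illustration only, not the d-dimensional robust path opening algorithm of the thesis.

```python
import numpy as np

def robust_row_opening(binary_img, L, G):
    """Toy gap-tolerant run filter along image rows.

    A run of foreground pixels may bridge at most G consecutive background
    pixels; only runs that accumulate at least L foreground pixels are kept.
    Illustrative 1-D sketch only, not the thesis's d-dimensional algorithm.
    """
    binary_img = np.asarray(binary_img).astype(bool)
    out = np.zeros_like(binary_img)
    for r, row in enumerate(binary_img):
        start, gap, length = None, 0, 0
        for c, v in enumerate(row):
            if v:
                if start is None:
                    start = c          # a new tolerant run begins here
                gap, length = 0, length + 1
            else:
                gap += 1
                if start is not None and gap > G:
                    # Tolerated gap exceeded: close the current run.
                    if length >= L:
                        out[r, start:c - gap + 1] = row[start:c - gap + 1]
                    start, length = None, 0
        if start is not None and length >= L:   # run reaching the row end
            out[r, start:] = row[start:]
    return out
```

    For example, robust_row_opening(img, L=50, G=2) keeps horizontal runs of at least 50 foreground pixels while tolerating breaks of up to 2 pixels; in the full operators the same idea is applied along families of flexible oriented paths in d dimensions rather than along image rows.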

    Hermite Snakes With Control of Tangents

    Full text link

    PHYSTAT-LHC Workshop on Statistical Issues for LHC Physics

    Get PDF
    A PHYSTAT workshop on the topic of Statistical issues for LHC physics was held at CERN. The workshop focused on issues related to discovery that we hope will be relevant to the LHC. These proceedings contain written versions of nearly all the talks, several of which were given by professional statisticians. The talks varied from general overviews, to those describing searches for specific particles. The treatment of background uncertainties figured prominently. Many of the talks describing search strategies for new effects should be of interest not only to particle physicists but also to scientists in other fields

    Thin Viscous Films on Curved Geometries

    Get PDF
    The topic of this thesis is the evolution of thin viscous films on curved substrates. Using techniques from differential geometry, namely the exterior calculus of differential forms, and from optimization theory, in particular the theory of saddle-point problems and the shape calculus, we reduce a variational form of the Stokes equations, which govern the flow, to a two-dimensional optimization problem with a PDE constraint on the substrate. This reduction is analogous to the lubrication approximation behind the classic thin-film equation. We study the well-posedness of a suitably regularised version of this reduced model of the flow using variational techniques. Furthermore, we study the well-posedness and convergence of time- and space-discrete versions of the model. The time discretization is based on the natural time discretization of a gradient flow, whereas the spatial discretization uses suitably chosen finite element spaces. Finally, we present a particular implementation of the discrete scheme on subdivision surfaces, together with relevant numerical results.
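
    As a point of reference for the time discretization mentioned above, one step of the natural (minimizing-movements) time discretization of a gradient flow can be sketched as below; the energy E, the metric d, the time step τ and the iterate u^k are generic placeholders, not the specific functionals of the thesis.

```latex
% One step of the natural (minimizing-movements) time discretization of a
% gradient flow: the next iterate balances proximity to the previous one
% against the energy decrease.
\[
  u^{k+1} \in \operatorname*{arg\,min}_{u}
  \left\{ \frac{1}{2\tau}\, d\!\left(u, u^{k}\right)^{2} + E(u) \right\}
\]
```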

    Biological image analysis

    Get PDF
    In biological research, images are extensively used to monitor growth, dynamics and changes in biological specimens, such as cells or plants. Many of these images are used solely for observation or are manually annotated by an expert. In this dissertation we discuss several methods to automate the annotation and analysis of bio-images. Two large clusters of methods have been investigated and developed. A first set of methods focuses on the automatic delineation of relevant objects in bio-images, such as individual cells in microscopy images. Since these methods should be useful for many different applications, e.g. to detect and delineate different objects (cells, plants, leaves, ...) in different types of images (different types of microscopes, regular colour photographs, ...), the methods should be easy to adjust. We therefore developed a methodology relying on probability theory, in which all required parameters can easily be estimated by a biologist without any knowledge of the techniques used in the actual software. A second cluster of investigated techniques focuses on the analysis of shapes. By defining new features that describe shapes, we are able to automatically classify shapes, retrieve similar shapes from a database and even analyse how an object deforms over time.
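
    As a purely generic illustration of "features that describe shapes" (not the descriptors developed in this dissertation), the sketch below computes a few classical quantities on a 2-D binary object mask with NumPy; the function name and the crude boundary-based perimeter estimate are assumptions made for this example.

```python
import numpy as np

def basic_shape_features(mask):
    """Generic 2-D binary-mask descriptors (illustrative only): area, a
    crude 4-connected boundary-pixel estimate of the perimeter, and the
    resulting circularity 4*pi*area / perimeter**2.
    """
    mask = np.asarray(mask).astype(bool)
    area = int(mask.sum())
    padded = np.pad(mask, 1)
    # Interior pixels: foreground pixels whose four neighbours are all foreground.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = int((mask & ~interior).sum())
    circularity = 4.0 * np.pi * area / perimeter ** 2 if perimeter else 0.0
    return {"area": area, "perimeter": perimeter, "circularity": circularity}
```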

    Doctor of Philosophy

    Get PDF
    Shape analysis is a well-established tool for processing surfaces. It is often a first step in tasks such as segmentation, symmetry detection, and finding correspondences between shapes. Shape analysis is traditionally employed on well-sampled surfaces whose geometry and topology are precisely known. When the surface is given as a point cloud with nonuniform sampling, noise, and incomplete measurements, traditional shape analysis methods perform poorly. Although one may first reconstruct a surface from such a point cloud before performing shape analysis, if the reconstructed geometry and topology are far from the true surface, this can have an adverse impact on the subsequent analysis. Furthermore, for triangulated surfaces containing noise, thin sheets, and poorly shaped triangles, existing shape analysis methods can be highly unstable. This thesis explores shape analysis methods applied directly to such defect-laden shapes. We first study the problem of surface reconstruction in order to better understand the types of point clouds for which reconstruction methods have difficulties. To this end, we have devised a benchmark for surface reconstruction, establishing a standard for measuring reconstruction error. We then develop a new method for consistently orienting the normals of such challenging point clouds using a collection of harmonic functions intrinsically defined on the point cloud. Next, we develop a new shape analysis tool that is tolerant to imperfections by constructing distances directly on the point cloud, defined as the likelihood that two points belong to a common medial ball, and apply it to segmentation and reconstruction. We extend this distance measure to define a diffusion process on the point cloud that is tolerant to missing data and is used for matching incomplete shapes undergoing nonrigid deformation. Lastly, we develop an intrinsic method for multiresolution remeshing of poor-quality triangulated surfaces via spectral bisection.
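
    To give a flavour of what "a diffusion process on the point cloud" can look like in code, here is a generic heat-style diffusion of a scalar signal over a k-nearest-neighbour graph built with SciPy. This is a standard stand-in, not the medial-ball likelihood distance or the specific diffusion construction of the thesis, and the function name and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_heat_diffusion(points, signal, k=8, step=0.05, n_steps=20):
    """Generic heat-style diffusion of a scalar signal over a k-NN graph
    of a point cloud (illustrative stand-in, not the thesis's method)."""
    points = np.asarray(points, dtype=float)
    u = np.asarray(signal, dtype=float).copy()
    n = len(points)
    k = min(k, n - 1)                                    # guard for tiny clouds
    dists, idx = cKDTree(points).query(points, k=k + 1)  # column 0 is the point itself
    sigma = dists[:, 1:].mean() + 1e-12                  # kernel bandwidth from the data
    W = np.zeros((n, n))
    for i in range(n):
        for j, d in zip(idx[i, 1:], dists[i, 1:]):
            w = np.exp(-(d / sigma) ** 2)
            W[i, j] = W[j, i] = max(W[i, j], w)          # symmetric edge weights
    Lap = np.diag(W.sum(axis=1)) - W                     # combinatorial graph Laplacian
    for _ in range(n_steps):
        u = u - step * (Lap @ u)                         # explicit Euler heat step
    return u
```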

    Combinatorial curve reconstruction and the efficient exact implementation of geometric algorithms

    Get PDF
    This thesis has two main parts. The first part deals with the problem of curve reconstruction. Given a finite sample set S from an unknown collection of curves Gamma, the task is to compute the graph G(S, Gamma) which has vertex set S and an edge between exactly those pairs of vertices that are adjacent on some curve in Gamma. We present a purely combinatorial algorithm that solves the curve reconstruction problem in polynomial time. It is the first algorithm that provably handles collections of curves with corners and endpoints. The second part of this thesis is concerned with the exact and efficient implementation of geometric algorithms. First, we develop a generalized filtering scheme to speed up exact geometric computation, and then we discuss the design of an object-oriented kernel for geometric computation.
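
    The filtering idea mentioned in the second part can be illustrated with the classic floating-point filter for a geometric predicate: evaluate in doubles, accept the sign if it clears a rounding-error bound, and otherwise fall back to exact rational arithmetic. The sketch below is a generic example of this well-known technique with a deliberately crude, conservative error bound; it is not the generalised filtering scheme developed in the thesis.

```python
from fractions import Fraction

def orient2d(ax, ay, bx, by, cx, cy):
    """Sign of the orientation determinant for points A, B, C.

    Generic illustration of a floating-point filter for exact geometric
    computation: evaluate in doubles first, and fall back to exact
    rational arithmetic only when the rounding-error bound leaves the
    sign uncertain.
    """
    det = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    # Crude, conservative semi-static error bound on the double evaluation.
    bound = 1e-14 * (abs(bx - ax) * abs(cy - ay) + abs(by - ay) * abs(cx - ax))
    if abs(det) > bound:
        return (det > 0) - (det < 0)
    # Filter failed: recompute exactly with rationals.
    exact = (Fraction(bx) - Fraction(ax)) * (Fraction(cy) - Fraction(ay)) \
          - (Fraction(by) - Fraction(ay)) * (Fraction(cx) - Fraction(ax))
    return (exact > 0) - (exact < 0)
```

    In the common case the cheap floating-point evaluation decides the sign, and the expensive exact computation is only triggered for near-degenerate inputs, which is the source of the speed-up such filters provide.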

    Bayesian Gaussian Process Models: PAC-Bayesian Generalisation Error Bounds and Sparse Approximations

    Get PDF
    Non-parametric models and techniques enjoy a growing popularity in the field of machine learning, and among these Bayesian inference for Gaussian process (GP) models has recently received significant attention. We feel that GP priors should be part of the standard toolbox for constructing models relevant to machine learning in the same way as parametric linear models are, and the results in this thesis help to remove some obstacles on the way towards this goal. In the first main chapter, we provide a distribution-free finite sample bound on the difference between generalisation and empirical (training) error for GP classification methods. While the general theorem (the PAC-Bayesian bound) is not new, we give a much simplified and somewhat generalised derivation and point out the underlying core technique (convex duality) explicitly. Furthermore, the application to GP models is novel (to our knowledge). A central feature of this bound is that its quality depends crucially on task knowledge being encoded faithfully in the model and prior distributions, so there is a mutual benefit between a sharp theoretical guarantee and empirically well-established statistical practices. Extensive simulations on real-world classification tasks indicate an impressive tightness of the bound, in spite of the fact that many previous bounds for related kernel machines fail to give non-trivial guarantees in this practically relevant regime. In the second main chapter, sparse approximations are developed to address the problem of the unfavourable scaling of most GP techniques with large training sets. Due to its high importance in practice, this problem has received a lot of attention recently. We demonstrate the tractability and usefulness of simple greedy forward selection with information-theoretic criteria previously used in active learning (or sequential design) and develop generic schemes for automatic model selection with many (hyper)parameters. We suggest two new generic schemes and evaluate some of their variants on large real-world classification and regression tasks. These schemes and their underlying principles (which are clearly stated and analysed) can be applied to obtain sparse approximations for a wide regime of GP models far beyond the special cases we studied here
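
    For context on the scaling issue that the sparse approximations target, below is a minimal textbook Gaussian-process regression posterior with an RBF kernel: the Cholesky factorisation of the n x n kernel matrix is the O(n^3) cost that becomes prohibitive for large training sets. The function name and hyperparameter values are illustrative placeholders; this is the naive dense GP, not the sparse greedy schemes developed in the thesis.

```python
import numpy as np

def gp_regression(X_train, y_train, X_test, lengthscale=1.0, signal=1.0, noise=0.1):
    """Minimal textbook GP regression posterior with an RBF kernel.
    X_train: (n, d), y_train: (n,), X_test: (m, d)."""
    def rbf(A, B):
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return signal ** 2 * np.exp(-0.5 * sq / lengthscale ** 2)

    K = rbf(X_train, X_train) + noise ** 2 * np.eye(len(X_train))
    Ks = rbf(X_train, X_test)
    Kss = rbf(X_test, X_test)
    L = np.linalg.cholesky(K)                    # O(n^3) bottleneck for large n
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = Ks.T @ alpha                          # posterior mean at the test inputs
    v = np.linalg.solve(L, Ks)
    cov = Kss - v.T @ v                          # posterior covariance
    return mean, cov
```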