
    Unsupervised Polygonal Reconstruction of Noisy Contours by a Discrete Irregular Approach

    In this paper, we present an original algorithm to build a polygonal reconstruction of noisy digital contours. For this purpose, we first improve an algorithm devoted to the vectorization of discrete irregular isothetic objects. We then use it to define a reconstruction process for noisy digital contours. More precisely, we rely on a local noise detector, introduced by Kerautret and Lachaud at IWCIA 2009, that builds a multi-scale representation of the digital contour, composed of pixels of various sizes depending on the local amount of noise. Finally, we compare our approach with previous work by considering the Hausdorff distance and the error on tangent orientations between the computed line segments and the original perfect contour. Using both synthetic and real noisy objects, we show that our approach performs well and could be applied in document analysis systems.
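    The evaluation relies on two standard metrics between the reconstructed segments and the perfect contour: the Hausdorff distance and the tangent-orientation error. The sketch below is only a reminder of how such metrics can be computed, not the authors' code; `reconstruction` and `contour` stand for hypothetical (N, 2) arrays of sampled points, and `seg_angles` / `ref_angles` for hypothetical arrays of segment and tangent orientations in radians.

```python
import numpy as np
from scipy.spatial.distance import cdist

def hausdorff(reconstruction, contour):
    """Symmetric Hausdorff distance between two 2D point sets of shape (N, 2)."""
    d = cdist(reconstruction, contour)     # pairwise Euclidean distances
    return max(d.min(axis=1).max(),        # farthest reconstructed point from the contour
               d.min(axis=0).max())        # farthest contour point from the reconstruction

def mean_tangent_error(seg_angles, ref_angles):
    """Mean absolute angular error, comparing angles modulo pi (undirected lines)."""
    diff = np.abs(seg_angles - ref_angles) % np.pi
    return np.minimum(diff, np.pi - diff).mean()
```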

    Local non-planarity of three dimensional surfaces for an invertible reconstruction: k-cuspal cells

    This paper addresses the problem of the maximal recognition of hyperplanes for an invertible reconstruction of 3D discrete objects. k-cuspal cells are introduced as a three-dimensional extension of the discrete cusps defined by R. Breton. With k-cuspal cells, local non-planarity on discrete surfaces can be identified in a very straightforward way.

    Two discrete-continuous operations based on the scaling transform

    In this paper we study the relationship between the Euclidean and the discrete world through two operations based on the Euclidean scaling function: discrete smooth scaling and discrete-based geometrical simplification.

    Accurate Cardinality Estimation of Co-occurring Words Using Suffix Trees (Extended Version)

    Estimating the cost of a query plan is one of the hardest problems in query optimization. This includes cardinality estimates of string search patterns, in particular of multi-word strings like phrases or text snippets. At first sight, suffix trees address this problem. To curb the memory usage of a suffix tree, one often prunes the tree to a certain depth. But this pruning method "takes away" more information from long strings than from short ones. The problem is particularly severe for sets of long strings, the setting studied here. In this article, we propose pruning techniques for this setting. Our approaches remove characters with low information value. The variants determine a character's information value in different ways, e.g., by using conditional entropy with respect to previous characters in the string. Our experiments show that, in contrast to the well-known pruned suffix tree, our technique provides significantly better estimates when the tree size is reduced by 60% or less. Due to the redundancy of natural language, our pruning techniques yield hardly any error for tree-size reductions of up to 50%.
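    As an illustration of the "low information value" idea only (the exact scoring used in the article may differ), a minimal sketch that ranks characters by their average surprisal given the preceding character; characters that are highly predictable from their left context would be candidates for removal before the suffix tree is built:

```python
import math
from collections import Counter, defaultdict

def avg_surprisal_per_char(strings):
    """Average surprisal -log2 P(c | previous char) for each character c.

    Low values mean the character is largely predictable from its left
    context and therefore carries little information.
    """
    pair_counts = defaultdict(Counter)          # previous char -> counts of next char
    for s in strings:
        for prev, cur in zip(s, s[1:]):
            pair_counts[prev][cur] += 1

    bits, occurrences = Counter(), Counter()
    for prev, nexts in pair_counts.items():
        total = sum(nexts.values())
        for cur, count in nexts.items():
            bits[cur] += -count * math.log2(count / total)
            occurrences[cur] += count
    return {c: bits[c] / occurrences[c] for c in occurrences}

# Characters with the lowest scores would be pruned first.
scores = avg_surprisal_per_char(["the query optimizer", "the query plan"])
```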

    Principles of Neural Network Architecture Design - Invertibility and Domain Knowledge

    Neural network architectures allow a tremendous variety of design choices. In this work, we study two principles underlying these architectures: first, the design and application of invertible neural networks (INNs); second, the incorporation of domain knowledge into neural network architectures. After introducing the mathematical foundations of deep learning, we address the invertibility of standard feedforward neural networks from a mathematical perspective. These results motivate our proposed invertible residual networks (i-ResNets). This architecture class is then studied in two scenarios: first, we propose ways to use i-ResNets as a normalizing flow and demonstrate their applicability to high-dimensional generative modeling; second, we study the excessive invariance of common deep image classifiers and discuss the consequences for adversarial robustness. We finish with a study of convolutional neural networks for tumor classification based on imaging mass spectrometry (IMS) data. For this application, we propose an adapted architecture guided by our knowledge of the domain of IMS data and show its superior performance on two challenging tumor classification datasets.
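    The invertibility of a residual block x ↦ x + g(x) hinges on g being a contraction, in which case the inverse can be recovered by fixed-point iteration. Below is a toy NumPy sketch of this principle only, not the thesis' i-ResNet implementation (which constrains convolutional layers and handles the log-determinant needed for normalizing flows):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# g(x) = c * tanh(W x) with ||W||_2 = 1 and c < 1, so Lip(g) <= c < 1
# and the residual map x -> x + g(x) is invertible.
W = rng.standard_normal((dim, dim))
W /= np.linalg.norm(W, 2)                    # normalize the spectral norm
c = 0.9

def g(x):
    return c * np.tanh(W @ x)

def forward(x):
    return x + g(x)

def inverse(y, iters=200):
    x = y.copy()                             # fixed-point iteration x <- y - g(x)
    for _ in range(iters):
        x = y - g(x)
    return x

x = rng.standard_normal(dim)
assert np.allclose(inverse(forward(x)), x, atol=1e-6)   # the block is invertible
```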

    A Posteriori Error Control for the Binary Mumford-Shah Model

    The binary Mumford-Shah model is a widespread tool for image segmentation and can be considered a basic model in shape optimization, with a broad range of applications in computer vision ranging from basic segmentation and labeling to object reconstruction. This paper presents robust a posteriori error estimates for a natural error quantity, namely the area of the improperly segmented region. To this end, a suitable strictly convex and unconstrained relaxation of the originally non-convex functional is investigated, and Repin's functional approach for a posteriori error estimation is used to control the numerical error for the relaxed problem in the $L^2$-norm. In combination with a suitable cut-out argument, a fully practical estimate for the area mismatch is derived. This estimate is incorporated into an adaptive meshing strategy. Two different adaptive primal-dual finite element schemes and the most frequently used finite difference discretization are investigated and compared. Numerical experiments show qualitative and quantitative properties of the estimates and demonstrate their usefulness in practical applications.
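    For reference, the binary Mumford-Shah energy in its usual two-phase piecewise-constant form; the sketch below only evaluates this energy on a pixel grid (the paper's strictly convex relaxation and the Repin-type a posteriori estimator are not reproduced here), with `f` a hypothetical grayscale image, `u` a {0,1} labeling, `c1`, `c2` the region intensities, and `nu` the perimeter weight:

```python
import numpy as np

def binary_mumford_shah_energy(f, u, c1, c2, nu):
    """E(u) = sum_x [ u (f - c1)^2 + (1 - u) (f - c2)^2 ] + nu * Per(u).

    The perimeter of the segmented region is approximated by the
    anisotropic total variation of the binary labeling u.
    """
    fidelity = np.sum(u * (f - c1) ** 2 + (1.0 - u) * (f - c2) ** 2)
    perimeter = np.abs(np.diff(u, axis=0)).sum() + np.abs(np.diff(u, axis=1)).sum()
    return fidelity + nu * perimeter
```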