
    Gradient extraction operators for discrete interval-valued data

    Digital images are generally created as discrete measurements of light, performed by dedicated sensors. Consequently, each pixel contains a discrete approximation of the light incident on a sensor element. The nature of this measurement implies a certain uncertainty due to discretization. In this work we propose to model such uncertainty using intervals, leading to the generation of so-called interval-valued images. We then study the partial differentiation of such images, putting a spotlight on antisymmetric convolution operators for this task. Finally, we illustrate the utility of interval-valued images by studying the behaviour of an extended version of the well-known Canny edge detection method.
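
    A minimal sketch of the idea, not the authors' implementation: each quantized pixel value is widened to an interval (here by an assumed half-quantization-step `delta`), and an antisymmetric kernel (a Sobel kernel is assumed) is applied with interval arithmetic, giving lower/upper bounds on the horizontal derivative at each pixel.

```python
import numpy as np

def interval_gradient_x(img, delta=0.5):
    """Horizontal interval-valued gradient via interval arithmetic.

    Each pixel v is widened to [v - delta, v + delta] (delta = assumed
    half of the quantization step), then convolved with an antisymmetric
    Sobel kernel: a non-negative weight keeps interval orientation, a
    negative weight flips it. Returns per-pixel (lower, upper) bounds.
    """
    kernel = np.array([[-1, 0, 1],
                       [-2, 0, 2],
                       [-1, 0, 1]], dtype=float)
    lo_img = img.astype(float) - delta
    hi_img = img.astype(float) + delta
    rows, cols = img.shape
    lo = np.zeros((rows - 2, cols - 2))
    hi = np.zeros((rows - 2, cols - 2))
    for i in range(rows - 2):
        for j in range(cols - 2):
            patch_lo = lo_img[i:i + 3, j:j + 3]
            patch_hi = hi_img[i:i + 3, j:j + 3]
            # weight >= 0 -> [w*lo, w*hi];  weight < 0 -> [w*hi, w*lo]
            lo[i, j] = np.sum(np.where(kernel >= 0, kernel * patch_lo, kernel * patch_hi))
            hi[i, j] = np.sum(np.where(kernel >= 0, kernel * patch_hi, kernel * patch_lo))
    return lo, hi
```

    An interval-aware edge detector such as the extended Canny mentioned above would then compare these gradient intervals (for instance against interval-valued thresholds) rather than single gradient values.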

    Proceedings of the XIII Global Optimization Workshop: GOW'16

    [Excerpt] Preface: Past Global Optimization Workshops have been held in Sopron (1985 and 1990), Szeged (WGO, 1995), Florence (GO'99, 1999), Hanmer Springs (Let's GO, 2001), Santorini (Frontiers in GO, 2003), San José (Go'05, 2005), Mykonos (AGO'07, 2007), Skukuza (SAGO'08, 2008), Toulouse (TOGO'10, 2010), Natal (NAGO'12, 2012) and Málaga (MAGO'14, 2014), with the aim of stimulating discussion between senior and junior researchers on the topic of Global Optimization. In 2016, the XIII Global Optimization Workshop (GOW'16) takes place in Braga and is organized by three researchers from the University of Minho. Two of them belong to the Systems Engineering and Operational Research Group of the Algoritmi Research Centre and the other to the Statistics, Applied Probability and Operational Research Group of the Centre of Mathematics. The event received more than 50 submissions from 15 countries in Europe, South America and North America. We want to express our gratitude to the invited speaker Panos Pardalos for accepting the invitation and sharing his expertise, helping us to meet the workshop objectives. GOW'16 would not have been possible without the valuable contributions of the authors and the International Scientific Committee members. We thank you all. This proceedings book intends to present an overview of the topics that will be addressed in the workshop, with the goal of contributing to interesting and fruitful discussions between the authors and participants. After the event, high-quality papers can be submitted to a special issue of the Journal of Global Optimization dedicated to the workshop. [...]

    3D coding tools final report

    Deliverable D4.3 of the ANR PERSEE project. This report was produced as part of the ANR PERSEE project (no. ANR-09-BLAN-0170); specifically, it corresponds to deliverable D4.3 of the project. Its title: 3D coding tools final report.

    Proceedings of Abstracts, School of Physics, Engineering and Computer Science Research Conference 2022

    © 2022 The Author(s). This is an open-access work distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. For further details please see https://creativecommons.org/licenses/by/4.0/. The plenary by Prof. Timothy Foat, 'Indoor dispersion at Dstl and its recent application to COVID-19 transmission', is © Crown copyright (2022), Dstl. This material is licensed under the terms of the Open Government Licence except where otherwise stated. To view this licence, visit http://www.nationalarchives.gov.uk/doc/open-government-licence/version/3 or write to the Information Policy Team, The National Archives, Kew, London TW9 4DU, or email: [email protected]. The present proceedings record the abstracts submitted and accepted for presentation at SPECS 2022, the second edition of the School of Physics, Engineering and Computer Science Research Conference, which took place online on 12th April 2022.

    Object-based video representations: shape compression and object segmentation

    Object-based video representations are considered useful for easing the process of multimedia content production and enhancing user interactivity in multimedia productions. Object-based video presents several new technical challenges, however. Firstly, as with conventional video representations, compression of the video data is a requirement. For object-based representations, it is necessary to compress the shape of each video object as it moves in time. This amounts to the compression of moving binary images, which is achieved using a technique called context-based arithmetic encoding. The technique is applied to rectangular pixel blocks and is therefore consistent with the standard tools of video compression. The block-based application also facilitates the exploitation of temporal redundancy in the sequence of binary shapes. For the first time, context-based arithmetic encoding is used in conjunction with motion compensation to provide inter-frame compression. The method, described in this thesis, has been thoroughly tested throughout the MPEG-4 core experiment process and, owing to favourable results, has been adopted as part of the MPEG-4 video standard. The second challenge lies in the acquisition of the video objects. Under normal conditions, a video sequence is captured as a sequence of frames and there is no inherent information about which objects are in the sequence, let alone information relating to the shape of each object. Some means of segmenting semantic objects from general video sequences is required. For this purpose, several image analysis tools may be of help, and in particular it is believed that video object tracking algorithms will be important. A new tracking algorithm is developed based on piecewise polynomial motion representations and statistical estimation tools, e.g. the expectation-maximisation method and the minimum description length principle.
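
    As an illustration of the context-modelling idea behind context-based arithmetic encoding of binary shape blocks, the sketch below conditions each pixel's probability on already-coded neighbours. The 4-pixel causal template and the function names are assumptions for illustration only; the MPEG-4 coder uses a larger template and an adaptive binary arithmetic coder driven by such conditional probabilities.

```python
import numpy as np

# Simplified causal neighbour template (dy, dx): pixels already coded
# when scanning the block in raster order. Not the MPEG-4 template.
TEMPLATE = [(-1, 0), (-1, -1), (0, -1), (-1, 1)]

def context_index(block, y, x):
    """Pack the causal neighbours of (y, x) into an integer context."""
    ctx = 0
    for k, (dy, dx) in enumerate(TEMPLATE):
        ny, nx = y + dy, x + dx
        inside = 0 <= ny < block.shape[0] and 0 <= nx < block.shape[1]
        bit = block[ny, nx] if inside else 0
        ctx |= int(bit) << k
    return ctx

def estimate_context_model(block):
    """Per-context symbol probabilities over one binary shape block.

    A real coder would feed (adaptively updated) counts like these into a
    binary arithmetic coder; pixels that are well predicted by their
    context cost very few bits.
    """
    counts = np.ones((1 << len(TEMPLATE), 2), dtype=int)  # Laplace smoothing
    for y in range(block.shape[0]):
        for x in range(block.shape[1]):
            counts[context_index(block, y, x), int(block[y, x])] += 1
    return counts / counts.sum(axis=1, keepdims=True)
```

    In the inter-frame case described above, part of the context would be drawn from the motion-compensated shape of the previous frame, which is how temporal redundancy enters the probability model.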

    Proceedings for the ICASE Workshop on Heterogeneous Boundary Conditions

    Domain decomposition is a complex problem with many interesting aspects. The choice of decomposition can be made based on many different criteria, and the choices of interface and internal boundary conditions are numerous. The various regions under study may have different dynamical balances, indicating that different physical processes dominate the flow in those regions. This conference was called in recognition of the need to define more clearly the nature of these complex problems. These proceedings are a collection of the presentations and of the discussion groups.

    Surface Reconstruction From 3D Point Clouds

    The triangulation of a point cloud of a 3D object is a complex problem, since it depends on the complexity of the shape of the object as well as on the density of points generated by a specific scanner. In the literature, there are essentially two approaches to the reconstruction of surfaces from point clouds: interpolation and approximation. In general, interpolation approaches are associated with simplicial methods, that is, methods that directly generate a triangle mesh from a point cloud. Approximation approaches, on the other hand, generate a global implicit function (which represents an implicit surface) from local shape functions and then triangulate that implicit surface. Simplicial methods fall into two families: Delaunay-based and mesh-growing methods. Bearing in mind that the first of the methods presented in this dissertation falls into the mesh-growing category, let us focus our attention on these methods for now. One of the biggest problems with these methods is that, in general, they rely on dihedral angle bounds between adjacent triangles to decide which triangle to add to the advancing mesh front. Typically, bounds are also used for the internal angles of each triangle. In the course of this dissertation, we will see how this problem was solved. The second algorithm introduced in this dissertation is also a simplicial method but does not fit into either of the two families mentioned above, which suggests that it defines a new family: triangulation based on an atlas of charts, or triangle stars. This algorithm generates an atlas of the surface consisting of overlapping stars of triangles, that is, it produces total surface coverage, thereby solving one of the common problems of this family of direct triangulation methods: the appearance of holes, or an incomplete triangulation of the surface. The third algorithm is an implicit method but, unlike most implicit methods, it uses an interpolation approach; that is, the local shape functions interpolate the points of the cloud. It is perhaps one of the few implicit methods in the literature that interpolates all points of the cloud. This solves one of the biggest problems of implicit methods: the smoothing of sharp surface features that results from blending the local functions into the global function. What is common to the three methods is the interpolation approach, whether in simplicial or implicit methods, that is, the linearization of the surface subject to reconstruction. As will be seen, the linearization of the neighborhood of each point allows us to solve several problems posed to surface reconstruction algorithms, namely point sub-sampling, non-uniform sampling, and sharp features.
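
    As a small illustration of the local linearization mentioned above (and not of the dissertation's specific algorithms), the sketch below fits a plane to the k nearest neighbours of a cloud point via PCA; the function name, the choice of k and the planarity measure are assumptions for illustration.

```python
import numpy as np

def local_plane(points, center_idx, k=12):
    """Linearize the neighbourhood of one point of the cloud.

    Fits a tangent plane to the k nearest neighbours of points[center_idx]
    by PCA: the plane passes through the neighbourhood centroid and its
    normal is the eigenvector of the smallest covariance eigenvalue.
    The returned planarity ratio is close to 0 for locally flat patches;
    larger values can flag sharp features or noisy, non-uniform sampling.
    """
    p = points[center_idx]
    dists = np.linalg.norm(points - p, axis=1)
    nbrs = points[np.argsort(dists)[:k]]      # includes the point itself
    centroid = nbrs.mean(axis=0)
    cov = np.cov((nbrs - centroid).T)         # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)    # ascending eigenvalues
    normal = eigvecs[:, 0]
    planarity = eigvals[0] / eigvals.sum()
    return centroid, normal, planarity
```

    A mesh-growing front could, for instance, test candidate triangles against the dihedral angle between their plane and such a local plane.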

    Efficient Algorithms for Large-Scale Image Analysis

    This work develops highly efficient algorithms for analyzing large images. Applications include object-based change detection and screening. The algorithms are 10-100 times as fast as existing software, sometimes even outperforming FPGA/GPU hardware, because they are designed to suit the computer architecture. This thesis describes the implementation details and the underlying algorithm-engineering methodology, so that both may also be applied to other applications.
    • 

    corecore