
    Hierarchical and Adaptive Filter and Refinement Algorithms for Geometric Intersection Computations on GPU

    Geometric intersection algorithms are fundamental to spatial analysis in Geographic Information Systems (GIS). This dissertation explores high-performance computing solutions for geometric intersection over very large spatial datasets using the Graphics Processing Unit (GPU). We have developed a hierarchical filter-and-refinement system for parallel geometric intersection operations involving large polygons and polylines, extending the classical filter-and-refine algorithm with efficient filters that leverage GPU computing. The inputs are two layers of large polygonal datasets, and the computations are spatial intersections on pairs of cross-layer polygons. These intersections are the compute-intensive spatial data analytic kernels in spatial join and map overlay operations in spatial databases and GIS. Efficient filters, such as PolySketch, PolySketch++, and point-in-polygon filters, have been developed to reduce the refinement workload on GPUs. We also show the application of such filters in speeding up line segment intersection and point-in-polygon tests. Programming models such as CUDA and OpenACC have been used to implement the different versions of the Hierarchical Filter and Refine (HiFiRe) system. Experimental results show good performance of our filter and refinement algorithms. Compared to a standard R-tree filter, our filter technique discards, on average, a further 76% of polygon pairs that have no segment intersection points. The PolySketch filter reduces the line segment intersection workload by 99.77% on average. Compared to the existing Common Minimum Bounding Rectangle (CMBR) filter applied to each cross-layer candidate pair, the workload after using the PolySketch-based CMBR filter is on average 98% smaller. The execution time of our HiFiRe system on two shapefiles, USA Water Bodies (464K polygons) and USA Block Group Boundaries (220K polygons), is about 3.38 seconds on an NVIDIA Titan V GPU.
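
    The filter-and-refine pattern underlying HiFiRe is easy to illustrate in sequential form. The sketch below uses a plain axis-aligned MBR overlap test as the filter; the dissertation's PolySketch filters are tighter, GPU-parallel variants of this idea, and `refine` stands in for any exact intersection test. This is a minimal CPU sketch, not the HiFiRe implementation itself.

        from itertools import product

        def mbr(poly):
            # Axis-aligned minimum bounding rectangle of a polygon's vertex list.
            xs, ys = zip(*poly)
            return min(xs), min(ys), max(xs), max(ys)

        def mbrs_overlap(a, b):
            # Two MBRs overlap unless one lies strictly left of / below the other.
            return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

        def filter_and_refine(layer1, layer2, refine):
            # Filter: a cheap MBR test discards most cross-layer pairs up front.
            candidates = [(p, q) for p, q in product(layer1, layer2)
                          if mbrs_overlap(mbr(p), mbr(q))]
            # Refine: run the exact (expensive) intersection test only on survivors.
            return [(p, q) for p, q in candidates if refine(p, q)]

    The filter stage is what makes the reported workload reductions possible: the cheaper and tighter the filter, the fewer pairs reach the expensive refinement step.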

    Increased Functionality of Floodplain Mapping Automation: Utah Inundation Mapping System (UTIMS)

    Flood plain mapping has become an increasingly important part of flood plain management. It employs mapping software and hydraulic calculation packages to map flood plains efficiently, and modelers often use automation software to develop the complex geometries required, reducing the time needed to build hydraulic models. The Utah Inundation Mapping System (UTIMS) is designed to reduce the time required to develop complex geometries for flood plain mapping studies. The automated geometries developed by UTIMS include flood-specific river centerlines, bank lines, flow path lines, cross sections, and areal-averaged n-value polygons. UTIMS thus facilitates developing automated input to the US Army Corps of Engineers' HEC-RAS software. Results from HEC-RAS can be imported back into UTIMS for display and mapping. The user can also specify convergence criteria for the water surface profile at selected locations along the river and run UTIMS and HEC-RAS iteratively until the convergence criterion is met; UTIMS develops a new flood-specific geometry file for each iteration, enabling accurate modeling of the flood plain. Using this robust and easy-to-operate software within the GIS environment, modelers can significantly reduce the time required to develop accurate flood plain maps. The time saved in developing the geometries allows modelers to spend more time on the actual modeling and on analyzing results, and can also mean faster turnaround and potential cost savings in flood plain modeling work. In this paper the authors describe UTIMS capabilities, compare them with other available software, and demonstrate the UTIMS flood plain automation process using a case study.
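
    The iterative UTIMS/HEC-RAS coupling amounts to a fixed-point loop over the water surface profile. A minimal sketch, where `build_geometry` and `run_hec_ras` are hypothetical stand-ins for the UTIMS geometry export and the HEC-RAS run, and the profile is a mapping from monitored station to water surface elevation:

        def run_until_converged(build_geometry, run_hec_ras, tolerance=0.01, max_iter=10):
            """Alternate geometry generation and hydraulic runs until the water
            surface elevation change at every monitored station falls below
            tolerance, mirroring the iterative workflow described above."""
            previous = None
            for _ in range(max_iter):
                geometry = build_geometry(previous)   # new flood-specific geometry file
                profile = run_hec_ras(geometry)       # dict: station -> water surface elevation
                if previous is not None and all(
                        abs(profile[s] - previous[s]) <= tolerance for s in profile):
                    return profile
                previous = profile
            return previous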

    Arbitrary topology meshes in geometric design and vector graphics

    Meshes are a powerful means to represent objects and shapes in both 2D and 3D, but many mesh-based techniques can only be used in certain regular settings, which restricts their usage. Meshes with arbitrary topology have many interesting applications in geometric design and (vector) graphics, and can give designers more freedom in designing complex objects. In the first part of the thesis we look at how these meshes can be used in computer-aided design to represent objects that consist of multiple regular meshes constructed together. We then extend the B-spline surface technique from the regular setting to work on extraordinary regions in meshes, so that multisided B-spline patches are created. In addition, we show how to render multisided objects efficiently using the GPU and tessellation. In the second part of the thesis we look at how the gradient mesh vector graphics primitive can be combined with procedural noise functions to create expressive but sparsely defined vector graphics images. We also look at how the gradient mesh can be extended to arbitrary topology variants; here, we compare existing work with two new formulations of a polygonal gradient mesh. Finally we show how to turn any image into a vector graphics image in an efficient manner. This vectorisation process automatically extracts important image features and constructs a mesh around them. The automatic pipeline is very efficient and even facilitates interactive image vectorisation.
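
    In the regular setting that the thesis starts from, a B-spline curve or surface is a knot-based weighted sum of control points. Below is a minimal curve-evaluation sketch using the standard Cox-de Boor recursion; the thesis's contribution is extending the surface version of this machinery to extraordinary, multisided regions, which this sketch does not cover.

        def bspline_basis(i, k, t, knots):
            # Cox-de Boor recursion for the i-th B-spline basis function of degree k.
            if k == 0:
                return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
            left = right = 0.0
            if knots[i + k] != knots[i]:
                left = ((t - knots[i]) / (knots[i + k] - knots[i])
                        * bspline_basis(i, k - 1, t, knots))
            if knots[i + k + 1] != knots[i + 1]:
                right = ((knots[i + k + 1] - t) / (knots[i + k + 1] - knots[i + 1])
                         * bspline_basis(i + 1, k - 1, t, knots))
            return left + right

        def bspline_point(ctrl, k, t, knots):
            # Evaluate a B-spline curve point as a weighted sum of control points.
            return tuple(sum(bspline_basis(i, k, t, knots) * c[d]
                             for i, c in enumerate(ctrl))
                         for d in range(len(ctrl[0])))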

    GPU Rasterization for Real-Time Spatial Aggregation over Arbitrary Polygons

    Visual exploration of spatial data relies heavily on spatial aggregation queries that slice and summarize the data over different regions. These queries comprise computationally intensive point-in-polygon tests that associate data points with polygonal regions, challenging the responsiveness of visualization tools. The challenge is compounded by the sheer amount of data, which requires a large number of such tests to be performed. Traditional pre-aggregation approaches are unsuitable in this setting since they fix the query constraints and support only rectangular regions; in visual analytics systems, by contrast, query constraints are defined interactively and polygons can have arbitrary shapes. In this paper, we convert a spatial aggregation query into a set of drawing operations on a canvas and leverage the rendering pipeline of the graphics hardware (GPU) to enable interactive response times. Our technique trades off accuracy for response time by adjusting the canvas resolution, and can even provide accurate results when combined with a polygon index. We evaluate our technique on two large real-world data sets, exhibiting superior performance compared to index-based approaches.
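
    The core idea, trading accuracy for canvas resolution, can be sketched on the CPU with NumPy; the paper does this with the GPU rendering pipeline instead, but the approximation behaviour is analogous. This minimal sketch assumes `points` is an (N, 2) array and `polygon` a list of (x, y) vertices, and uses `matplotlib.path` for the point-in-polygon test:

        import numpy as np
        from matplotlib.path import Path

        def rasterized_count(points, polygon, bounds, resolution=512):
            """Approximate the count of points inside polygon by rasterizing both
            onto a grid; coarser grids trade accuracy for speed."""
            xmin, ymin, xmax, ymax = bounds
            # Bin points into grid cells (a 2D histogram plays the role of the canvas).
            counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                          bins=resolution,
                                          range=[[xmin, xmax], [ymin, ymax]])
            # Rasterize the polygon: test each cell centre once, not each point.
            xs = (np.arange(resolution) + 0.5) * (xmax - xmin) / resolution + xmin
            ys = (np.arange(resolution) + 0.5) * (ymax - ymin) / resolution + ymin
            cx, cy = np.meshgrid(xs, ys, indexing="ij")
            inside = Path(polygon).contains_points(
                np.column_stack([cx.ravel(), cy.ravel()])).reshape(resolution, resolution)
            return counts[inside].sum()

    Raising `resolution` shrinks the cells whose centres misclassify boundary points, which is exactly the accuracy/response-time knob the paper describes.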

    A Prototype Method for Storing Symbols for Multiple Maps in a Single Geodatabase Using ArcGIS Cartographic Representations

    ArcGIS 9.2 software, released in late 2006, introduced a new way for ESRI users to store symbology in the geodatabase. This new method, called cartographic representations, presents new challenges for those producing high-quality maps from the GIS, including the development of new workflows that incorporate the new technology. The project used an existing geodatabase and a test set of hard-copy maps as a base from which to develop a prototype methodology for implementing cartographic representations. The main purpose of the project was to discover how feature symbols for multiple map products could be stored within a single geodatabase. In the course of the research, the new techniques and functionality available with cartographic representations were evaluated against the standard ArcMap symbol-management tools.

    GIS-based modelling for fuel reduction using controlled burns in Australia: case study: Logan City, Queensland

    Bushfire is a long-standing threat and environmental problem in Australia, and planning to control bushfire is very important for the Australian environment. One of the most effective ways of fighting bushfire disasters is planning controlled burns to reduce the risk of unwanted bushfire events. In a controlled burn, patches or blocks that risk threatening the environment and people are selected and burned deliberately under safe, controlled conditions, ensuring that the ready-to-burn bark and tree canopy, the 'fuel load', are eliminated from the area. Controlled-burn management and planning has always been considered important by town planners. The aim of this study is to produce a tool for prioritizing burn blocks based on different criteria, giving planners a sound scientific basis for choosing the most important blocks for controlled burns. The study addresses three research tasks: (1) investigating criteria related to prescribed-burn management and their usability in designing a model for analysing the long-term geospatial suitability of bushfire prescribed burns; (2) finding a suitable model for scoring blocks designated as long-term fuel-reduction prescribed-burn blocks; and (3) testing the model in a pilot area. Several criteria for building a GIS-based multi-criteria analysis were studied and their importance weights debated. The methodology comprised reviewing the literature and weight-determination methods, and consulting experts through interviews, small surveys, and focus groups in a stakeholder organization to identify the most relevant and important criteria. Eleven criteria were ultimately chosen and compared pairwise by interviewees to derive their importance weights. The resulting model considers all the criteria selected in the criteria-analysis phase that are usable for planning and prioritizing burn blocks, and provides a sound and robust basis for deciding which blocks are most suitable to burn from a long-term point of view. The GIS data used in the model were acquired from the pilot area's relevant authorities. The model was developed with ESRI's ArcGIS analysis tools and the ArcGIS Spatial Analyst extension. The Analytical Hierarchy Process (AHP) was used to combine criteria importance and develop a unified value-based solution to the study's multi-criteria analysis problem, organized around two main themes, 'Implementation' and 'Safety'. The model was tested on the Logan City area in South East Queensland, Australia, an administrative area for which all the criteria data were prepared and acquired. Because combining the final results by simple overlay can introduce bias (some blocks match the safety theme well but not the implementation theme, and vice versa), the two theme results were combined using an optimization methodology based on probabilistic principles to generate the final prioritized blocks; the higher a block's rank, the higher its priority for burning first in long-term planning. The usability of the results was tested by Logan City Council managers and Parks Department bushfire experts. The suitability of the blocks was very close to what the experts had in mind, and the model results were validated as completely satisfactory: all blocks ranked by the model accorded with the experts' practical perception from field visits and field knowledge. Overall, the tool created by this study gives decision makers a good basis for deciding long-term priorities when planning controlled-burn activities, and a long-term outlook for the budget and resources to be allocated to fuel-reduction controlled burns, facilitating short-term planning as well.
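
    The AHP step mentioned above turns pairwise importance judgements into normalized criterion weights. Below is a minimal sketch of a common AHP formulation (principal-eigenvector weights plus Saaty's consistency ratio); the study's exact elicitation and weighting procedure is not reproduced here.

        import numpy as np

        def ahp_weights(pairwise):
            """Derive criterion weights from an AHP pairwise-comparison matrix
            via its principal eigenvector, and report the consistency ratio
            (CR < 0.1 is the usual acceptance threshold)."""
            A = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(A)
            principal = eigvecs[:, np.argmax(eigvals.real)].real
            weights = principal / principal.sum()
            n = len(A)
            ci = (eigvals.real.max() - n) / (n - 1)   # consistency index
            ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41,
                  9: 1.45, 10: 1.49, 11: 1.51}.get(n, 1.51)  # Saaty's random index
            return weights, ci / ri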

    Development of infrastructure information models according to open standards, with automatic parameterization from geomatic data

    This thesis seeks to develop methodologies for exporting the geomatic information of transport infrastructures, particularly railway and road structures, obtained with mobile mapping technologies. It develops procedures for generating information models of these structures from the relevant information in the point clouds obtained with these systems. For this purpose, the BIM standards for civil engineering structures, both those currently available and those published during the course of the thesis, are exploited and adopted. Information-modelling techniques are used with these standards, with the aim of obtaining a system that models the structures automatically. The models are also made compatible with other methodologies designed for BIM, whose purpose is to take full advantage of the available information for management and maintenance tasks. Meeting these objectives yields an automatic modelling system conforming to the BIM standards for transport infrastructures, suitable for automatic feeding from geomatic and remote sensing data, and in turn integrable into management and maintenance systems for these types of civil engineering structures.

    A query processing system for very large spatial databases using a new map algebra

    In this thesis we introduce a query processing approach for spatial databases and explain the main concepts we defined and developed: a spatial algebra and a graph-based approach used in the optimizer. The spatial algebra was defined to express queries and transformation rules during the different steps of query optimization. To cover a vast variety of potential applications, we tried to define the algebra as completely as possible. The algebra views spatial data as maps of spatial objects: the algebraic operators act on maps and produce new maps, while aggregate functions act on maps and objects and produce objects or basic values (characters, numbers, etc.). The optimizer receives the query as an algebraic expression and produces an efficient QEP (Query Evaluation Plan) through two main consecutive stages: QEG (Query Evaluation Graph) generation and QEP generation. In QEG generation we construct a graph equivalent to the algebraic expression and then apply graph transformation rules to produce an efficient QEG. In QEP generation we take the efficient QEG, perform predicate ordering and approximation, and generate the efficient QEP. The QEP is a set of consecutive phases that must be executed in the specified order; each phase consists of one or more primitive operations, and all primitive operations in the same phase can be executed in parallel. We implemented the optimizer, a random spatial query generator, and a simulated spatial database. The query generator produces random queries for testing the optimizer. The simulated spatial database is a set of functions that simulate primitive spatial operations, returning the cost of the corresponding primitive operation for given input parameters. We fed randomly generated queries to the optimizer, obtained the generated QEPs, and passed them to the spatial database simulator, using the experimental results to discuss the optimizer's characteristics and performance. The optimizer was designed for databases with a very large number of spatial objects; nevertheless, most of the concepts we used can be applied to all spatial information systems.
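
    The predicate-ordering step in QEP generation is a classic cost-based heuristic. Here is a minimal sketch under the standard independence assumption (rank predicates by cost per unit of filtering power); the thesis's actual cost model and approximation step are not reproduced:

        def order_predicates(predicates):
            """Order filter predicates so that cheap, highly selective tests run
            first. Each predicate is (name, selectivity, cost), with selectivity
            the fraction of tuples expected to survive the test."""
            def rank(p):
                name, selectivity, cost = p
                # Classic optimal ordering for independent predicates:
                # sort ascending by cost / (1 - selectivity).
                return cost / (1.0 - selectivity) if selectivity < 1.0 else float("inf")
            return sorted(predicates, key=rank)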