
    Novel intelligent spatiotemporal grid earthquake early-warning model

    The integrated analysis of multi-type geospatial information poses challenges to existing spatiotemporal data organization models and deep-learning-based analysis models. For earthquake early warning, this study proposes a novel intelligent spatiotemporal grid model based on GeoSOT (SGMG-EEW) for feature fusion of multi-type geospatial data. The model comprises a seismic grid sample model (SGSM) and a spatiotemporal grid model based on a three-dimensional group convolutional neural network (3DGCNN-SGM). The SGSM solves the problem that layers of different data types cannot form an ensemble with a consistent data structure, and it transforms the grid representation of the data into grid samples for deep learning. The 3DGCNN-SGM is the first application of group convolution to the deep learning of multi-source geographic information data. It avoids the direct superposition of data from different layers, which may negatively affect the results of a deep learning analysis model. Taking atmospheric temperature anomaly data and historical earthquake precursor data from Japan as an example, an earthquake early-warning verification experiment was conducted with the proposed SGMG-EEW. Five control experiments were designed: atmospheric temperature anomaly data only, historical earthquake data only, a non-group-convolution control group, a support vector machine control group, and a seismic statistical analysis control group. The results show that the proposed SGSM is not only compatible with the expression of a single type of spatiotemporal data but can also support multiple types of spatiotemporal data, forming a deep-learning-oriented data structure. Compared with a traditional deep learning model, the proposed 3DGCNN-SGM is more suitable for the integrated analysis of multiple types of spatiotemporal data.
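
    A minimal sketch of the grouping idea, assuming a PyTorch implementation (the paper does not specify one): setting the groups argument of a 3D convolution to the number of geospatial layers gives each layer its own filter bank, so the layers are not superposed in the first convolution. Channel counts and grid sizes are invented for the example.

        import torch
        import torch.nn as nn

        # Two geospatial layers (e.g. temperature anomaly and historical
        # seismicity) rasterized onto one spatiotemporal grid:
        # (batch, layers, time, y, x).
        x = torch.randn(1, 2, 8, 32, 32)

        # groups=2 gives each input layer its own filter bank, so the two
        # layers are convolved separately instead of being superposed in
        # the first convolution.
        grouped = nn.Conv3d(in_channels=2, out_channels=8, kernel_size=3,
                            padding=1, groups=2)

        # An ordinary Conv3d (groups=1) would mix the layers immediately.
        mixed = nn.Conv3d(in_channels=2, out_channels=8, kernel_size=3,
                          padding=1)

        print(grouped(x).shape)      # torch.Size([1, 8, 8, 32, 32])
        print(grouped.weight.shape)  # (8, 1, 3, 3, 3): one layer per filter
        print(mixed.weight.shape)    # (8, 2, 3, 3, 3): both layers mixed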

    A comparison of the least squares collocation and the fast Fourier transform methods for gravimetric geoid determination

    The objective of this research was to study the performance of the least squares collocation (LSC) and fast Fourier transform (FFT) techniques for gravimetric geoid computation. The Land Levelling Datum (LLD), the South African vertical datum, is based on more than 100-year-old tide gauge measurements of mean sea level (MSL). Because the LLD is poorly defined, an alternative is required, and the SAGEOID10 hybrid geoid model (Merry, 2009) was computed to replace the existing vertical datum. Two gravimetric geoid models were computed using different techniques for evaluating the Stokes integral: LSC and the one-dimensional fast Fourier transform (1D-FFT). The long-wavelength component of the geoid models was computed using the EGM2008 geopotential model truncated at degree 720. Fast spectral techniques are needed because both the quality and the variety of data available for geoid determination have increased. The FFT method is more efficient than the LSC method, since it requires less computation time on large data sets; the LSC method generates a system of linear equations of order equal to the number of data points. The geoid model was computed over the province of Gauteng and compared to the SAGEOID10 hybrid geoid model. The computed geoid models, SiPLSC and SiPFFT, agreed with the SAGEOID10 model with a standard deviation of 5.6 cm. The long-wavelength component of the computed geoid model agreed with the EGM2008 geopotential geoid model with a standard deviation of 4.2 cm.
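
    The efficiency argument can be sketched in a few lines of Python. LSC builds and solves a dense system whose order equals the number of observations, so its cost grows roughly cubically, whereas the 1D-FFT approach evaluates the Stokes integral along a parallel as a convolution in O(n log n). The covariance function and kernel below are toy stand-ins, not the actual Stokes kernel or the covariance model used in the research.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 500                        # number of gravity observations (toy)
        obs = rng.normal(size=n)       # residual gravity anomalies (toy)

        # LSC flavour: build and solve an n x n covariance system, so the
        # cost grows roughly cubically with the number of data points.
        dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
        C = np.exp(-dist / 50.0)             # toy covariance function
        C_obs = C + 0.01 * np.eye(n)         # add observation noise variance
        coeff = np.linalg.solve(C_obs, obs)  # the expensive step
        geoid_lsc = C @ coeff                # prediction at the data points

        # FFT flavour: along a parallel the Stokes integral is a circular
        # convolution, so it costs O(n log n) via the FFT (toy kernel).
        shift = np.arange(n)
        kernel = np.exp(-np.minimum(shift, n - shift) / 50.0)
        geoid_fft = np.real(np.fft.ifft(np.fft.fft(obs) * np.fft.fft(kernel)))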

    Self-consistent propagation of flux ropes in realistic coronal simulations

    The aim of this paper is to demonstrate the possible use of the new coronal model COCONUT to compute a detailed representation of a numerical CME at 0.1 AU, after its injection at the solar surface and propagation in a realistic solar wind derived from observed magnetograms. We present the implementation and propagation of modified Titov-Démoulin (TDm) flux ropes in the COCONUT 3D MHD coronal model. The background solar wind is reconstructed for two opposite configurations, representing a solar activity maximum and minimum respectively, both derived from magnetograms obtained by the Helioseismic and Magnetic Imager (HMI) onboard the Solar Dynamics Observatory (SDO) satellite. We track the propagation of 24 flux ropes that differ only in their initial magnetic flux. We investigate, in particular, the geometry of the flux rope during the early stages of propagation, as well as the influence of its initial parameters and of the solar wind configuration on 1D profiles derived at 0.1 AU. At the beginning of the propagation, the shape of the flux ropes differs between the low- and high-activity simulations. We find dynamics consistent with the standard CME model, such as the pinching of the legs and the appearance of post-flare loops. Despite the differences in geometry, the synthetic density and magnetic field time profiles at 0.1 AU are very similar in the two solar wind configurations. These profiles resemble those observed farther out in the heliosphere and suggest the presence of a magnetic ejecta composed of the initially implemented flux rope and a sheath ahead of it. Finally, we uncover relationships between the properties of the magnetic ejecta, such as density or speed, and the initial magnetic flux of the flux ropes.

    Real-time Visual Flow Algorithms for Robotic Applications

    Vision offers important sensor cues to modern robotic platforms. Applications such as control of aerial vehicles, visual servoing, simultaneous localization and mapping, navigation and, more recently, learning are examples where visual information is fundamental to accomplishing tasks. However, the use of computer vision algorithms carries the computational cost of extracting useful information from the stream of raw pixel data. The most sophisticated algorithms use complex mathematical formulations, typically leading to computationally expensive and consequently slow implementations. Even with modern computing resources, high-speed, high-resolution video feeds can only be used for basic image processing operations. For a vision algorithm to be integrated on a robotic system, the output of the algorithm should be provided in real time, that is, at least at the same frequency as the control logic of the robot. With robotic vehicles becoming more dynamic and ubiquitous, this places higher requirements on the vision processing pipeline. This thesis addresses the problem of estimating dense visual flow information in real time. The contributions of this work are threefold. First, it introduces a new filtering algorithm for the estimation of dense optical flow at frame rates as fast as 800 Hz for 640x480 image resolution. The algorithm follows an update-prediction architecture to estimate dense optical flow fields incrementally over time. A fundamental component of the algorithm is the modeling of the spatio-temporal evolution of the optical flow field by means of partial differential equations. Numerical predictors can implement such PDEs to propagate the current flow estimate forward in time. Experimental validation of the algorithm is provided using a high-speed ground-truth image dataset as well as real-life video data at 300 Hz. The second contribution is a new type of visual flow named structure flow. Mathematically, structure flow is the three-dimensional scene flow scaled by the inverse depth at each pixel in the image. Intuitively, it is the complete velocity field associated with image motion, including both optical flow and scale change, or apparent divergence, of the image. Analogously to optical flow, structure flow provides a robotic vehicle with perception of the motion of the environment as seen by the camera. However, structure flow encodes the full 3D image motion of the scene, whereas optical flow only encodes the component on the image plane. An algorithm to estimate structure flow from image and depth measurements is proposed, based on the same filtering idea used to estimate optical flow. The final contribution is the spherepix data structure for processing spherical images. This data structure is the numerical back-end used for the real-time implementation of the structure flow filter. It consists of a set of overlapping patches covering the surface of the sphere. Each individual patch approximately preserves properties such as orthogonality and equidistance of points, thus allowing efficient implementations of low-level classical 2D convolution-based image processing routines such as Gaussian filters and numerical derivatives. These algorithms are implemented on GPU hardware and can be integrated into future robotic embedded vision systems to provide fast visual information to robotic vehicles.
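
    A minimal sketch of the update-prediction idea, with deliberate simplifications: a nearest-neighbour semi-Lagrangian step stands in for the PDE-based numerical predictors, a constant-gain blend stands in for the update step, and the flow measurement is a random placeholder.

        import numpy as np

        def predict(flow, dt=1.0):
            """Propagate the flow field forward in time by advecting it
            along itself: a crude semi-Lagrangian step for the transport
            PDE dF/dt + (F . grad)F = 0 (nearest-neighbour sampling)."""
            h, w = flow.shape[:2]
            ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
            # Backtrace: where did the flow at (y, x) come from?
            src_x = np.clip(xs - dt * flow[..., 0], 0, w - 1).astype(int)
            src_y = np.clip(ys - dt * flow[..., 1], 0, h - 1).astype(int)
            return flow[src_y, src_x]

        def update(predicted, measured, gain=0.5):
            """Correct the prediction with a new flow measurement;
            gain plays the role of a (fixed) filter gain."""
            return predicted + gain * (measured - predicted)

        # Illustrative filter loop: predict, then correct, once per frame.
        flow = np.zeros((480, 640, 2))
        for _ in range(10):
            measured = np.random.randn(480, 640, 2) * 0.1  # placeholder
            flow = update(predict(flow), measured)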

    Tropical daylighting : predicting sky types and interior illuminance in north-east Brazil.

    Daylight is present in tropical regions at considerable intensity throughout the year. The sky characteristics are changeable, and sunlight cannot be disregarded. Daylighting techniques are still needed that answer particular tropical conditions. The main aim of this thesis is to present a daylighting analysis tool for the tropics, developed out of existing procedures. It is structured in three parts. The first part provides a broad view of climatic aspects related to daylighting studies in a typical tropical city, Maceió, Brazil. A brief climatic description of the city and a study relating climate and building are followed by a literature review of climatic fundamentals. A study is made of meteorological station measurements in relation to the city, and a field investigation is described. These lead to a simplified method for sky type selection. It shows that a reasonable assumption about daylight climate can be made from very simple data and that the new structure of the CIE Standard General Sky could be applied everywhere. The second part investigates methods that could be appropriate for calculating daylighting in humid climates and concludes with a methodology based on an adaptation of existing techniques. The Monte Carlo and ray tracing techniques are reviewed, as well as the daylight coefficients concept. These are incorporated in prototype software, TropLux, written in MATLAB code. The development of the method in this thesis can be seen as an extension of the daylight factor concept to the CIE Standard General Sky and reflected sunlight. The software was validated; the results show that the level of prediction is comparable with that produced by Radiance, and overall the results appear to be robust. Analysis indicates that a climate-specific calculation technique is not essential: universal lighting software is viable, provided the local climate and architectural characteristics are taken into account. The last part applies TropLux to ground-reflected light. It is found that the influence of reflected sunlight on interior illuminance can be very large. Among the shading devices analysed, the overhang showed the best performance. There is a key zone of ground outside the window that provides the majority of the reflected light. A direct design implication can be the reduction of window size.
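
    The daylight coefficients concept that TropLux incorporates can be sketched briefly: the contribution of each sky patch to the interior illuminance is precomputed once per geometry, after which any sky luminance distribution reduces to a dot product. A 145-patch Tregenza sky subdivision is assumed, and the coefficients are random placeholders rather than ray-traced values.

        import numpy as np

        N_PATCHES = 145   # Tregenza subdivision of the sky vault (assumed)

        rng = np.random.default_rng(1)
        # dc[i]: illuminance at the sensor per unit luminance of sky patch
        # i. In practice these would come from Monte Carlo ray tracing;
        # random placeholders here.
        dc = rng.uniform(0.0, 1e-3, N_PATCHES)

        def interior_illuminance(sky_luminance):
            """E = sum_i dc_i * L_i for a given sky luminance distribution,
            e.g. one of the 15 CIE Standard General Sky types."""
            return float(dc @ sky_luminance)

        overcast = np.full(N_PATCHES, 2000.0)   # flat toy luminances, cd/m2
        print(interior_illuminance(overcast), "lux at the sensor point")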

    Compression Methods for Structured Floating-Point Data and their Application in Climate Research

    The use of new technologies, such as GPU boosters, has led to a dramatic increase in the computing power of High-Performance Computing (HPC) centres. This development, coupled with new climate models that can better utilise this computing power thanks to software development and internal design, has moved the bottleneck from solving the differential equations describing Earth's atmospheric interactions to actually storing the variables. The current approach to solving the storage problem is inadequate: either the number of variables to be stored is limited or the temporal resolution of the output is reduced. If it is subsequently determined that another variable is required which has not been saved, the simulation must be run again. This thesis deals with the development of novel compression algorithms for structured floating-point data such as climate data, so that they can be stored in full resolution. Compression is performed by decorrelation and subsequent coding of the data. The decorrelation step eliminates redundant information in the data. During coding, the actual compression takes place and the data is written to disk. A lossy compression algorithm additionally has an approximation step to unify the data for better coding; this step reduces the complexity of the data for the subsequent coding, e.g. through quantization. This work makes a new scientific contribution to each of the three steps described above. The thesis presents a novel lossy compression method for time-series data that uses an Auto Regressive Integrated Moving Average (ARIMA) model to decorrelate the data. In addition, the concept of information spaces and contexts is presented to use information across dimensions for decorrelation. Furthermore, a new coding scheme is described which reduces the weaknesses of the eXclusive-OR (XOR) difference calculation and achieves a better compression factor than current lossless compression methods for floating-point numbers. Finally, a modular framework is introduced that allows the creation of user-defined compression algorithms. The experiments presented in this thesis show that it is possible to increase the information content of lossily compressed time-series data by applying an adaptive compression technique which preserves selected data with higher precision. An analysis of lossless compression of these time-series showed no success. However, the lossy ARIMA compression model proposed here is able to capture all relevant information: the reconstructed data reproduce the time-series to such an extent that statistically relevant information for the description of climate dynamics is preserved. Experiments indicate that there is a significant dependence of the compression factor on the selected traversal sequence and the underlying data model. The influence of these structural dependencies on prediction-based compression methods is investigated in this thesis. For this purpose, the concept of Information Spaces (IS) is introduced. IS improves the predictions of the individual predictors by nearly 10% on average; perhaps more importantly, the standard deviation of the compression results is on average 20% lower. Using IS thus provides better predictions and more consistent compression results. Furthermore, it is shown that shifting the prediction and true value leads to a better compression factor at minimal additional computational cost.
    This allows the use of more resource-efficient prediction algorithms to achieve the same or a better compression factor, or a higher throughput during compression or decompression. The coding scheme proposed here achieves a better compression factor than current state-of-the-art methods. Finally, this thesis presents a modular framework for the development of compression algorithms. The framework supports the creation of user-defined predictors and offers functionalities such as the execution of benchmarks, the random subdivision of n-dimensional data, the quality evaluation of predictors, the creation of ensemble predictors, and the execution of validity tests for sequential and parallel compression algorithms. This research was initiated by the needs of climate science, but the application of its contributions is not limited to it. The results of this thesis are of major benefit to the development and improvement of any compression algorithm for structured floating-point data.
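
    The prediction-plus-XOR idea behind such coding schemes can be sketched as follows, with a deliberately trivial last-value predictor standing in for the ARIMA and information-space predictors developed in the thesis.

        import struct

        def f64_bits(x: float) -> int:
            """Reinterpret a float64 as its 64-bit integer pattern."""
            return struct.unpack("<Q", struct.pack("<d", x))[0]

        def xor_residuals(series):
            """Predict each value (trivial last-value predictor here),
            XOR the bit patterns of prediction and truth, and report the
            leading zero bits: the long zero runs of a good prediction
            are what the coder then compresses."""
            prev = 0.0
            out = []
            for x in series:
                residual = f64_bits(x) ^ f64_bits(prev)  # 0 iff exact
                lz = 64 - residual.bit_length() if residual else 64
                out.append((residual, lz))
                prev = x   # last-value prediction for the next sample
            return out

        # A smooth series yields good predictions, hence many leading
        # zeros for the coder to exploit.
        data = [20.0 + 0.01 * i for i in range(5)]
        for residual, lz in xor_residuals(data):
            print(f"{lz:2d} leading zero bits, residual {residual:#018x}")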

    Development and evaluation of methodologies for monitoring droughts and their impacts on agriculture in data-scarce areas

    Drought is one of the natural phenomena that cause the greatest socio-economic and environmental damage. Its impacts are of particular importance in agriculture, as this activity is closely linked to food security and quality of life in many territories. Droughts can occur in any climatic regime in the world, with arid and semi-arid areas being the most affected and the most prone to drought events. In regions particularly exposed and vulnerable to drought, specific drought studies are needed to help manage and mitigate its impacts. This thesis is a contribution to the management of drought and its impacts, specifically on agriculture. Several novel, bespoke methodologies were developed with the aim of increasing knowledge of drought phenomena and providing solutions for water resources and drought management. Freely available global-scale hydrometeorological data sources were used, so the methodologies can be applied to any country or region of the world. The case studies were Mozambique and Argentina, both developing countries with significant agricultural activity (in terms of cropland extent), prone to drought events, and chosen for their economic situation and their complex data availability. The methodologies focused on defining and understanding the spatio-temporal characteristics of droughts; defining and relating drought events to their triggers; validating tools for monitoring droughts and their impacts on agricultural activity; and transferring knowledge to all beneficiaries and stakeholders involved in drought management in data-scarce regions. The methodologies are of general applicability and can be replicated worldwide, providing meaningful information to the scientific, technical and management community to develop, calibrate or validate existing and new formulations. In addition, they could contribute to the creation of drought mitigation and adaptation plans aimed at reducing impacts, especially in agriculture.
    Xunta de Galicia; ED481A-2018/16

    Quality of Service Aware Data Stream Processing for Highly Dynamic and Scalable Applications

    Huge amounts of georeferenced data streams arrive daily at data stream management systems deployed to serve highly scalable and dynamic applications. There are innumerable ways in which those loads can be exploited to gain deep insights in various domains. Decision makers require an interactive visualization of such data, in the form of maps and dashboards, for decision making and strategic planning. Data streams normally exhibit fluctuation and oscillation in arrival rates and skewness; those are the two predominant factors that greatly impact the overall quality of service. This requires data stream management systems to be attuned to those factors, in addition to the spatial shape of the data, which may exaggerate their negative impact. Current systems do not natively support services with quality guarantees for dynamic scenarios, leaving the handling of those logistics to the user, which is challenging and cumbersome. Three workloads are predominant for any data stream: batch processing, scalable storage and stream processing. In this thesis, we have designed a quality-of-service-aware system, SpatialDSMS, that comprises several subsystems covering those loads and any mixed load that results from intermixing them. Most importantly, we have natively incorporated quality-of-service optimizations for processing avalanches of georeferenced data streams in highly dynamic application scenarios. This has been achieved transparently on top of the codebases of emerging de facto standard, best-in-class representatives, thus relieving users at the presentation layer from having to reason about those services. Instead, users express their queries with quality goals, and our system optimizer compiles them down into query plans with an embedded quality guarantee, leaving logistic handling to the underlying layers. We have developed standards-compliant prototypes for all the subsystems that constitute SpatialDSMS.
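
    A hypothetical sketch of the 'queries with quality goals' idea; the names and the cost model below are invented for illustration and are not SpatialDSMS's actual API. The user states a latency bound and a tolerated approximation, and a toy optimizer picks the cheapest plan, falling back to load shedding when scaling out cannot meet the bound.

        from dataclasses import dataclass

        @dataclass
        class QualityGoal:
            max_latency_ms: float   # end-to-end latency bound
            min_accuracy: float     # tolerated approximation, 1.0 = exact

        @dataclass
        class Plan:
            parallelism: int
            sampling_rate: float    # < 1.0 means load shedding

        def compile_plan(goal: QualityGoal, arrival_rate_hz: float,
                         cost_ms_per_tuple: float = 0.2) -> Plan:
            """Pick the cheapest plan whose estimated latency meets the
            goal (a toy cost model, not SpatialDSMS's optimizer)."""
            for parallelism in (1, 2, 4, 8, 16):
                est = cost_ms_per_tuple * arrival_rate_hz / parallelism
                if est <= goal.max_latency_ms:
                    return Plan(parallelism, sampling_rate=1.0)
            # Scaling out alone cannot meet the bound: shed load instead,
            # trading accuracy for latency down to the tolerated minimum.
            return Plan(16, sampling_rate=max(goal.min_accuracy, 0.1))

        print(compile_plan(QualityGoal(max_latency_ms=50.0,
                                       min_accuracy=0.9),
                           arrival_rate_hz=5000.0))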

    Terrainosaurus: realistic terrain synthesis using genetic algorithms

    Synthetically generated terrain models are useful across a broad range of applications, including computer-generated art and animation, virtual reality and gaming, and architecture. Existing algorithms for terrain generation suffer from a number of problems, especially being limited in the types of terrain they can produce and being difficult for the user to control. Typical applications of synthetic terrain have several factors in common: first, they require the generation of large regions of believable (though not necessarily physically correct) terrain features; and second, while real-time performance is often needed when visualizing the terrain, this is generally not the case when generating it. In this thesis, I present a new design-by-example method for synthesizing terrain height fields. In this approach, the user designs the layout of the terrain by sketching out simple regions using a CAD-style interface and specifies the desired terrain characteristics of each region by providing example height fields displaying these characteristics (these height fields will typically come from real-world GIS data sources). A height field matching the user's design is generated at several levels of detail, using a genetic algorithm to blend together chunks of elevation data from the example height fields in a visually plausible manner. This method has the advantage of producing an unlimited diversity of reasonably realistic results, while requiring relatively little user effort and expertise. The guided randomization inherent in the genetic algorithm allows the algorithm to come up with novel arrangements of features, while still approximating user-specified constraints.
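
    A compact sketch of the genetic-algorithm blending step, under simplifying assumptions: the genome is a grid of chunk indices into the example height fields, fitness rewards seam smoothness only (the thesis also matches the user's specified characteristics and works at several levels of detail), and the example height fields are synthetic stand-ins for real GIS data.

        import numpy as np

        rng = np.random.default_rng(2)

        CHUNK, GRID, N_EXAMPLES = 16, 8, 6
        # Toy example height fields; in practice, chunks of real GIS data.
        examples = [np.cumsum(rng.normal(size=(CHUNK, CHUNK)), axis=0)
                    for _ in range(N_EXAMPLES)]

        def render(genome):
            """Assemble a height field by tiling the chunks the genome picks."""
            rows = [np.hstack([examples[g] for g in row]) for row in genome]
            return np.vstack(rows)

        def fitness(genome):
            """Penalize elevation jumps across chunk seams (plausibility)."""
            hf = render(genome)
            seam = 0.0
            for k in range(1, GRID):
                seam += np.abs(hf[CHUNK * k] - hf[CHUNK * k - 1]).sum()
                seam += np.abs(hf[:, CHUNK * k] - hf[:, CHUNK * k - 1]).sum()
            return -seam   # smoother seams -> higher fitness

        def evolve(pop_size=30, generations=40, mutation=0.1):
            pop = [rng.integers(0, N_EXAMPLES, (GRID, GRID))
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                survivors = pop[:pop_size // 2]
                children = []
                for _ in range(pop_size - len(survivors)):
                    a, b = rng.choice(len(survivors), 2, replace=False)
                    mask = rng.random((GRID, GRID)) < 0.5   # uniform crossover
                    child = np.where(mask, survivors[a], survivors[b])
                    flip = rng.random((GRID, GRID)) < mutation
                    child[flip] = rng.integers(0, N_EXAMPLES, flip.sum())
                    children.append(child)
                pop = survivors + children
            return render(max(pop, key=fitness))

        terrain = evolve()   # a (GRID*CHUNK) x (GRID*CHUNK) height field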

    A Methodology for Natural Resources Analysis Appropriate for County Level Planning

    In this thesis a methodology for an integrated, cumulative analysis of sensitive natural resources was developed. Themes of natural resources (waterways, wetlands, forested lands, prime agricultural soils, and steep slopes) were brought together in a GIS, in grid format, in such a manner that each cell of the grid accumulates value according to the increasing presence of resource themes. For example, an area (a 30 meter x 30 meter grid cell) containing only one of the above themes is given a value of 1, whereas an area containing slopes, streams, and forests might, after weighting factors, have a value of 5. The result is a map that shows the cumulative sensitivity of a given area and its relation to the landscape under analysis. The methodology uses off-the-shelf GIS software and available GIS data sources, and is designed to require a minimum of technical and financial resources. It is particularly useful for counties in Tennessee in meeting the requirements of Public Chapter 1101, the Growth Policy Act. The case study for this thesis reveals that much development does, in fact, occur in sensitive natural areas and that, therefore, this tool could be well utilized by planners to inform the public and to assist in the development of policy aimed at protecting sensitive areas from activities that would reduce their capacity to serve their natural functions.
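
    The accumulation step is simple enough to sketch directly: each theme is a boolean presence raster on the 30-meter grid, and cell values are weighted sums across themes. Rasters and weights are invented for illustration; the steep-slope weight of 3 merely reproduces the abstract's example of a cell with slopes, streams, and forest scoring 5.

        import numpy as np

        rng = np.random.default_rng(3)
        shape = (100, 100)          # 100 x 100 cells of 30 m x 30 m each

        # Each theme is a boolean presence raster plus a weight
        # (rasters and weights invented for illustration).
        themes = {
            "waterways":    (rng.random(shape) < 0.05, 1),
            "wetlands":     (rng.random(shape) < 0.03, 1),
            "forest":       (rng.random(shape) < 0.30, 1),
            "prime_soils":  (rng.random(shape) < 0.20, 1),
            "steep_slopes": (rng.random(shape) < 0.10, 3),
        }

        # Cumulative sensitivity: a cell with only one unit-weight theme
        # scores 1; a cell with slopes (weight 3) + streams + forest
        # scores 5, matching the example in the abstract.
        sensitivity = np.zeros(shape, dtype=int)
        for presence, weight in themes.values():
            sensitivity += weight * presence.astype(int)

        print("most sensitive cell scores", sensitivity.max())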