AEGIS - A state-of-the-art component-based spatio-temporal framework for education and research
In past years, geoinformation has gained a significant role in information technology due to the spread of GPS localization, navigation systems and the publication of geographical data via the Internet. The inclusion of semantic information and temporal change has also become increasingly important in GIS. The overwhelming amount of spatial and spatio-temporal data has led to increased research effort on processing algorithms and efficient data management solutions. This article presents the AEGIS framework, a spatio-temporal data management system currently under development at Eötvös Loránd University, Faculty of Informatics (ELTE IK). This framework will serve as the future platform of GIS education and research at ELTE IK. It aims to introduce a data model for the uniform representation of raster and vector data with temporal references; to enable efficient data management using specialized indexing; and to support internal revision control of editing operations. The framework offers a data processing engine that automatically transforms operations for distributed execution using GPGPUs, allows fast operations even on large datasets, and provides high scalability with regard to new methods. To demonstrate the usage of the system, two prototype applications – segment-based image classification and agent-based traffic simulation – are also presented.
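The uniform representation of raster and vector data with temporal references described above can be illustrated with a minimal sketch. This is a hypothetical model for illustration only, not AEGIS's actual API: both kinds of data share one temporally referenced envelope, so the same temporal predicate applies to either.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical sketch (not the AEGIS data model itself): raster and
# vector features share one temporally referenced bounding envelope.
@dataclass
class TemporalEnvelope:
    min_x: float
    min_y: float
    max_x: float
    max_y: float
    valid_from: datetime
    valid_to: datetime

    def intersects_time(self, instant: datetime) -> bool:
        """True if the feature is valid at the given time instant."""
        return self.valid_from <= instant <= self.valid_to

@dataclass
class VectorFeature:
    envelope: TemporalEnvelope
    coordinates: list  # list of (x, y) vertex tuples

@dataclass
class RasterPatch:
    envelope: TemporalEnvelope
    pixels: list  # row-major pixel values

# Both data kinds can now be filtered by the same temporal predicate.
env = TemporalEnvelope(0, 0, 10, 10,
                       datetime(2013, 1, 1), datetime(2013, 12, 31))
print(env.intersects_time(datetime(2013, 6, 1)))  # True
```

Whatever the concrete types, keying both raster and vector objects on one shared spatio-temporal envelope is what makes a single index and a single query interface possible.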
A study of various techniques for storing inhomogeneous descriptive data of digital maps
In contemporary Geographical Information Systems (GIS), the large variety of digital map sources requires handling and storing various descriptive data in a single storage facility. Although the number and types of attributes can vary among different maps, storing such inhomogeneous data in a single database is difficult, as querying is an essential task and requires fast retrieval of data based on any present attribute. In this study, the authors compare three different approaches to this problem, based on relational and object-oriented database systems, by implementing and testing them with massive inhomogeneous and changing descriptive data. Since this problem is not unique to the field of GIS, the solution can be applied generally to any domain using inhomogeneous data, such as e-commerce systems and document warehouses.
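One common relational technique for such inhomogeneous attributes is the entity-attribute-value (EAV) layout. The abstract does not name the three approaches it compares, so the following is only a hedged illustration of the general problem, using Python's built-in sqlite3 module:

```python
import sqlite3

# Hypothetical illustration: an entity-attribute-value (EAV) table lets
# each map feature carry an arbitrary set of descriptive attributes.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE attributes (
        feature_id INTEGER,
        name TEXT,
        value TEXT
    )
""")
# Index on (name, value) supports fast lookup by any present attribute.
conn.execute("CREATE INDEX idx_name_value ON attributes(name, value)")

# Two features with different attribute sets, as on inhomogeneous maps.
rows = [
    (1, "road_type", "highway"),
    (1, "lanes", "4"),
    (2, "building_use", "school"),
]
conn.executemany("INSERT INTO attributes VALUES (?, ?, ?)", rows)

# Query by any attribute, even one most features do not have.
hits = conn.execute(
    "SELECT feature_id FROM attributes WHERE name = ? AND value = ?",
    ("road_type", "highway"),
).fetchall()
print(hits)  # [(1,)]
```

The trade-off the study points at is visible even here: the EAV layout accepts any attribute set without schema changes, but reassembling a whole feature requires joining or aggregating many rows.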
A case study of advancing remote sensing image analysis
Big data and cloud computing are two phenomena which have gained significant attention over the last few years. In computer science, the approach has shifted towards distributed architectures and high-performance computing. In geographical information systems (GIS) and remote sensing image analysis, the new paradigms have already been successfully applied to several problems, and systems have been developed to support the processing of geographical and remote sensing data in the cloud. However, due to different circumstances, many previous workflows have to be reconsidered and redesigned. Our goal is to show how existing approaches to remote sensing image analysis can be advanced to take advantage of these new paradigms. Shifting the algorithms should require only moderate effort and must avoid the complete redesign and reimplementation of the existing approaches. We present the whole journey as a case study, using an existing industrial workflow for demonstration. We also define rules of thumb that can come in handy when shifting any existing GIS workflow. Our case study is the workflow of waterlogging and flood detection, an operative task at the Institute of Geodesy, Cartography and Remote Sensing (FÖMI). This task is currently carried out using a semi-automatic, single-machine approach involving multiple software packages. The workflow is neither efficient nor scalable, and thus it is not applicable in emergency situations where a quick response is required. We present an approach utilizing distributed computing, which enables the automated execution of this task on large input data with much better response time. The approach is based on the well-known MapReduce paradigm, its open-source implementation, the Apache Hadoop framework, and the AEGIS geospatial toolkit. This enables the replacement of multiple software packages with a single, generic framework. Results show that significant performance benefits can be achieved at the expense of a minor loss of accuracy.
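The MapReduce idea behind distributing such a detection task can be sketched in a few lines of plain Python. This is a hedged illustration of the paradigm only, not the FÖMI/AEGIS implementation; the tile representation and the water threshold are invented for the example:

```python
from functools import reduce

# Hypothetical threshold: pixels darker than this count as water.
WATER_THRESHOLD = 50

def map_tile(tile):
    """Map step: count water pixels and total pixels in one image tile."""
    water = sum(1 for px in tile if px < WATER_THRESHOLD)
    return (water, len(tile))

def reduce_counts(a, b):
    """Reduce step: merge partial (water, total) counts from two tiles."""
    return (a[0] + b[0], a[1] + b[1])

# Each "tile" stands in for an independently processable image chunk;
# in a real Hadoop job, map_tile would run on different cluster nodes.
tiles = [[10, 80, 30], [90, 95], [20, 25, 70, 40]]
water, total = reduce(reduce_counts, map(map_tile, tiles))
print(f"water fraction: {water / total:.2f}")  # water fraction: 0.56
```

The map step is embarrassingly parallel because each tile is processed independently, and the reduce step is associative, which is exactly the shape of computation Hadoop distributes well; the case study's point is that many GIS workflows can be recast into this shape without a full rewrite.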
Webes alkalmazások fejlesztése (Development of Web Applications)
These lecture notes were prepared with the support of the 2016 lecture-note grant of the ELTE Faculty of Informatics.
Szoftvertechnológia (Software Technology)
These lecture notes were prepared with the support of the 2016 lecture-note grant of the ELTE Faculty of Informatics.
Alkalmazott modul: Programozás (Applied Module: Programming)
These lecture notes were prepared with the support of the 2015 lecture-note grant of the ELTE Faculty of Informatics.
Eseményvezérelt alkalmazások fejlesztése I (Development of Event-Driven Applications I)
These lecture notes were prepared with the support of the 2014 lecture-note grant of the ELTE Faculty of Informatics.
Big Geospatial Data processing in the IQmulus Cloud
Remote sensing instruments are continuously evolving in terms of spatial, spectral and temporal resolution and hence provide exponentially increasing amounts of raw data. These volumes increase significantly faster than computing speeds. All these techniques record vast amounts of data, yet in different data models and representations; the resulting datasets therefore require harmonization and integration before meaningful information can be derived from them. All in all, huge datasets are available, but raw data is of almost no value if it is not processed, semantically enriched and quality checked. The derived information needs to be transferred and published to all levels of possible users (from decision makers to citizens). Up to now, there have been only limited automatic procedures for this; thus, a wealth of information lies latent in many datasets. This paper presents the first achievements of the IQmulus EU FP7 research and development project with respect to the processing and analysis of big geospatial data in the context of flood and waterlogging detection.