9 research outputs found

    Storing spatial data in multiuser level and tracking history in database

    The increased use of databases in geographic information systems has prompted consideration of the other benefits that databases can bring to such systems. One of these benefits is offering users a multi-user environment; another is tracking the history of data. At any stage of a geographic information system, several users may need to update the same data simultaneously. In that case the database management system must allow its users to update the data concurrently without taking multiple copies of it. In geographic information systems, spatial information has mostly been split into location and attribute components, and the time component has been left out of the system. Including time information will bring the geographic information system closer to the real world it represents and will allow more reliable use in spatial planning and decision making. Modeling history in the database will thus make it possible to view the database contents at a past date, or the change of an object over time. The choice of data model is therefore very important for tracking the history of data and providing multi-user editing. This data model should not increase the volume of the database and should not use complex algorithms to store the data. Using a single data model for both history tracking and the multi-user environment will also benefit performance. To serve this purpose, software called spatial data servers, which allows spatial data to be stored in databases, has been developed. This study examines database management systems in order to determine the most suitable method these products offer for a multi-user environment and for tracking the history of data.
Keywords: database management system, versioning, history tracking.

The increased use of database management systems in geographical information system technologies has directed users to exploit the other advantages of database management systems. One of these advantages is support for a multi-user environment; the other is tracking the history of data. A key feature of any multi-user database is the ability to manage concurrent access to the data. Users may need to edit the same data at the same time at any stage of a geographical information system, so the database management systems underlying geographical information systems must support multi-user concurrent editing without creating multiple copies of the data. Databases support this with versioning. The primary role of versioning is to simplify the editing experience: many geographical information system edits take more than a few minutes to complete, and some editing tasks take hours, days, or even months. Such long transactions are supported by creating versions. Versioning lets users simultaneously create multiple, persistent representations of the database without making copies of the data. It involves recording and managing changes to a multi-user database by creating a "version" of the database: an alternative, independent, persistent view of the database that does not involve copying the data and supports multiple concurrent editors. Multiple users can simultaneously edit the same features or rows without explicitly applying locks to prohibit other users from modifying the same data. In geographical information systems, spatial information is commonly broken into space and attribute components, and the time component is not included in the system. Including the time component will bring the geographical information system closer to the real world it represents, and will enable more reliable use of the data in spatial planning and decision making.
Thus modeling history in a database allows visualizing the database's contents at a particular time in the past, or visualizing how a particular object has changed over time. Many organizations wish to manage how their geographical database changes over time; this process is commonly referred to as history. A historical database records information about database entities (objects, features, and relationships) through time. The historical database can be queried at a point in time to get the correct set of entities that existed in the database at that time. Modeling history in a database allows for analysis, such as visualizing how a particular object changed over time. History has traditionally been achieved through a special archiving strategy or through custom applications that store deleted features in a special history layer or maintain fields on tables with active dates for each row. The development of temporal data modeling in geographical information systems parallels the progress of temporal data modeling in database systems: the incorporation of temporal components has been implemented first with the relational model and then with object-oriented data models. Versioning provides an alternative to these methods. A version can be used to represent the state of a database at a specific point in time. The versioning model will maintain the old representation of objects, deleted objects, and the times at which these events happened, without having to manage special layers or date stamps on features. A versioned database is a database that can have multiple persistent representations of its contents without the need for data replication. Versioning allows an organization to manage alternative engineering designs, to solve complex "what if" scenarios without impacting the corporate database, and to create point-in-time representations of the database. A versioned database will contain a number of versions.
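The date-stamp strategy mentioned above can be sketched in a few lines: each row carries a valid-from/valid-to interval, and a point-in-time query simply filters on that interval. This is a minimal illustrative sketch; the feature names and dates are invented, not taken from the paper.

```python
from datetime import date

# Hypothetical feature history: each row carries valid_from/valid_to dates,
# the "active dates for each row" strategy described in the abstract.
history = [
    {"feature_id": 1, "name": "Road A",
     "valid_from": date(2001, 1, 1), "valid_to": date(2003, 6, 1)},
    {"feature_id": 1, "name": "Road A (widened)",
     "valid_from": date(2003, 6, 1), "valid_to": None},
    {"feature_id": 2, "name": "Bridge B",
     "valid_from": date(2002, 3, 1), "valid_to": None},
]

def snapshot(rows, as_of):
    """Return the rows that were current on the given date."""
    return [
        r for r in rows
        if r["valid_from"] <= as_of
        and (r["valid_to"] is None or as_of < r["valid_to"])
    ]

# Querying the database "at a point in time":
print([r["name"] for r in snapshot(history, date(2002, 5, 1))])
# → ['Road A', 'Bridge B']
```

The same filter, pushed into a SQL WHERE clause, is what the custom applications described above effectively do; versioning replaces the per-row date bookkeeping with a separate version structure.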
A version is an alternative representation of the database that has an owner, a description, a level of access, and a parent version. Versions are not affected by changes occurring in other versions of the database. Typically, a version will represent a work order, a design alternative, or a historical point in time of the database. It is therefore very important to select a suitable data model for tracking the history of data and for supporting a multi-user environment. This data model must not increase the size of the database and must not use complex algorithms to store spatial data. Various applications have been developed on top of database management systems to achieve this aim; one of them is software called a spatial database engine, which allows spatial data to be stored in database management systems. The aim of this study is to examine database management systems and spatial database engines and to decide on the most convenient method that they provide.

Keywords: database management systems, versioning, tracking history.
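The version model sketched in this abstract (an owner, a parent version, and edits that do not copy the base data) can be approximated as a tree of lightweight delta sets over a shared base state. This is a minimal sketch of the idea, not any vendor's implementation; all class and field names are illustrative.

```python
# Each version stores only its own adds/deletes; reading a version walks up
# to its parent, so no copy of the base data is ever made.
class Version:
    def __init__(self, name, owner, parent=None):
        self.name, self.owner, self.parent = name, owner, parent
        self.adds = {}        # row_id -> row inserted/edited in this version
        self.deletes = set()  # row_ids removed in this version

    def rows(self):
        """Materialize this version's view: parent state + local deltas."""
        base = {} if self.parent is None else dict(self.parent.rows())
        base.update(self.adds)
        for rid in self.deletes:
            base.pop(rid, None)
        return base

default = Version("DEFAULT", owner="dbo")
default.adds = {1: "parcel 1", 2: "parcel 2"}

# Two editors work concurrently in their own versions: no locks, no copies,
# and neither sees the other's changes until a reconcile step (not shown).
a = Version("design-A", owner="alice", parent=default)
a.adds[3] = "proposed parcel 3"
b = Version("design-B", owner="bob", parent=default)
b.deletes.add(2)

print(sorted(a.rows()))  # [1, 2, 3]
print(sorted(b.rows()))  # [1]
```

Because each version records only deltas, the model also covers the history use case above: keeping a version per point in time preserves old and deleted objects without date stamps on individual features.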

    Web GIS-Based Flood Management System for the Architectural Heritage

    Master's thesis, Department of Architecture, Seoul National University Graduate School, February 2015 (advisor: Hyun-Soo Lee). In recent years, flood damage has been increasing drastically due to global warming, urbanization, irregular weather conditions and so on. In particular, flash floods caused by torrential rain and locally heavy rainfall damage architectural heritage before appropriate measures can be taken. Multilateral efforts have been put into solving the issue; however, there remain weaknesses in responding effectively to flood risk at the cultural heritage buildings located all over the nation. To solve the problem, the Cultural Heritage Administration conducted research from 2009 to 2012. Despite these efforts, there are still difficulties in responding actively to flood disasters, due to unclassified research data, low accessibility of the information and so on. To address these limitations, this research presents an effective flood management system by integrating a Web Geographic Information System (Web GIS) with a Relational Database Management System (RDBMS) and using real-time rainfall data. Compared to the traditional system, the suggested Web GIS-based flood management system is expected to be more efficient, adaptable and flexible.
Ultimately, this research aims to serve as a supporting tool for flood risk managers' decision making.

Chapter 1 Introduction
    1.1 Research Objective
    1.2 Research Scope and Process
Chapter 2 Literature Review
    2.1 Flood Risk Management for Architectural Heritage
    2.2 Flash Flood and Response Time
    2.3 Flood Risk Managers' Decision Making
    2.4 Summary
Chapter 3 Database for the System
    3.1 Database Overview
    3.2 Data Classification and DB Development
    3.3 Entity Relationship Diagram Design
    3.4 Summary
Chapter 4 Web Geographic Information System
Chapter 5 Flood Management System
    5.1 System Requirements
    5.2 System Architecture Design
    5.3 System Interface and Function
    5.4 Expression of Real Time Flood Risk
    5.5 Provision of Reaction Manual
    5.6 Summary
Chapter 6 System Usability Evaluation
    6.1 Overview
    6.2 Selection of Subjects
    6.3 Evaluation Factor
    6.4 Result and Analysis
Chapter 7 Conclusion
    7.1 Research Results
    7.2 Contributions
    7.3 Limitations and Future Research
References
Abstract (Korean)
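At its core, the "Expression of Real Time Flood Risk" function described in this thesis amounts to mapping observed rainfall to an alert level that the Web GIS can display. The sketch below illustrates that mapping only; the thresholds and level names are hypothetical, not taken from the thesis.

```python
# Hypothetical alert levels keyed to rainfall intensity (mm/h), checked from
# the most severe threshold downward. Real systems would calibrate these per
# site and combine them with site vulnerability data from the RDBMS.
THRESHOLDS = [(50.0, "severe"), (30.0, "warning"), (10.0, "watch")]

def risk_level(rainfall_mm_per_hour):
    """Map a real-time rainfall observation to an alert level."""
    for limit, level in THRESHOLDS:
        if rainfall_mm_per_hour >= limit:
            return level
    return "normal"

print(risk_level(42.0))  # warning
print(risk_level(5.0))   # normal
```

A Web GIS front end would then color each heritage site by the level returned for its latest rainfall observation, which is consistent with the real-time display the abstract describes.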

    Sensor web geoprocessing on the grid

    Recent standardisation initiatives in the fields of grid computing and geospatial sensor middleware provide an exciting opportunity for the composition of large scale geospatial monitoring and prediction systems from existing components. Sensor middleware standards are paving the way for the emerging sensor web which is envisioned to make millions of geospatial sensors and their data publicly accessible by providing discovery, task and query functionality over the internet. In a similar fashion, concurrent development is taking place in the field of grid computing whereby the virtualisation of computational and data storage resources using middleware abstraction provides a framework to share computing resources. Sensor web and grid computing share a common vision of world-wide connectivity and in their current form they are both realised using web services as the underlying technological framework. The integration of sensor web and grid computing middleware using open standards is expected to facilitate interoperability and scalability in near real-time geoprocessing systems. The aim of this thesis is to develop an appropriate conceptual and practical framework in which open standards in grid computing, sensor web and geospatial web services can be combined as a technological basis for the monitoring and prediction of geospatial phenomena in the earth systems domain, to facilitate real-time decision support. The primary topic of interest is how real-time sensor data can be processed on a grid computing architecture. This is addressed by creating a simple typology of real-time geoprocessing operations with respect to grid computing architectures. A geoprocessing system exemplar of each geoprocessing operation in the typology is implemented using contemporary tools and techniques which provides a basis from which to validate the standards frameworks and highlight issues of scalability and interoperability. 
It was found that it is possible to combine standardised web services from each of these domains, despite interoperability issues arising from differences in web service style and security between specifications. A novel integration method for the continuous processing of a sensor observation stream is suggested, in which a perpetual processing job is submitted as a single continuous compute job. Although this method was found to be successful, two key challenges remain: a mechanism for consistently scheduling real-time jobs within an acceptable time-frame must be devised, and the trade-off between efficient grid resource utilisation and processing latency must be balanced. The lack of actual implementations of distributed geoprocessing systems built using sensor web and grid computing has hindered the development of standards, tools and frameworks in this area. This work contributes to the small number of existing implementations in this field by identifying potential workflow bottlenecks in such systems and gaps in the existing specifications. Furthermore, it sets out a typology of real-time geoprocessing operations that is anticipated to facilitate the development of real-time geoprocessing software.

EThOS - Electronic Theses Online Service. Engineering and Physical Sciences Research Council (EPSRC); School of Civil Engineering & Geosciences, Newcastle University. United Kingdom.
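The perpetual-job integration method described above can be sketched as a single long-running process that consumes the observation stream and emits derived values continuously. In this illustrative sketch the simulated stream and the threshold check stand in for Sensor Observation Service queries and a grid-submitted compute job; none of the names come from the thesis.

```python
import time

def observation_stream():
    """Simulated sensor observations (e.g. river level in metres)."""
    levels = [1.2, 1.3, 1.8, 2.4, 2.1]
    for level in levels:
        yield level
        time.sleep(0.01)  # polling interval; much longer in practice

def process(stream, flood_threshold=2.0):
    """Body of the perpetual job: flag observations exceeding a threshold.

    Submitted once as a single continuous compute job, this loop runs for
    the lifetime of the stream rather than as one job per observation.
    """
    for level in stream:
        yield (level, level > flood_threshold)

alerts = [obs for obs, exceeded in process(observation_stream()) if exceeded]
print(alerts)  # [2.4, 2.1]
```

The scheduling challenge noted above shows up even in this sketch: because the job never terminates, the grid scheduler must treat it differently from the batch jobs it was designed for, and idle polling time counts against resource utilisation.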

    Geographic information systems in business

    1st edition, ©200

    Организация баз данных (Database Organization)

    Course description. The course is devoted to the theoretical foundations and the practical methods and tools of building databases, as well as to issues related to the database life cycle, maintenance and support. It covers the basic concepts of databases, ways of classifying them, the principles of organizing data structures, and the corresponding types of database management systems (DBMS). The relational data model, normalization theory, and the DBMSs that follow this model (using MS SQL Server as an example) are studied in detail, together with SQL, the standard query language for relational DBMSs, and methods of representing complex data structures by means of a relational DBMS. The course considers the organization of shared access to data and introduces the concepts of referential integrity and semantic integrity of data, transactions, and the problems associated with them and methods of solving them. Data preservation and security, backup, and data compression methods are also considered. An overview of hierarchical, non-relational and post-relational, object-oriented, full-text, network, and distributed DBMSs is given. Students learn to build an ER model with Entity Framework in Visual Studio and to create an application for working with databases in the Visual Studio development environment in C#.

Abstract of the "Database Organization" discipline. The purpose of teaching the discipline is to develop students' understanding of the role of automated data banks in the creation of information systems.
The objectives of the discipline are: the study of the data models supported by different database management systems (DBMS); the study of non-relational models; the study of elements of relational database theory; familiarity with the principles of DBMS construction; and the study of distributed DBMSs and of application development tools for these DBMSs.
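Two of the course topics named above, referential integrity and transactions, can be illustrated in a few lines using Python's standard-library sqlite3 module (the course itself uses MS SQL Server; SQLite merely stands in here for any relational DBMS).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enable FK enforcement in SQLite
conn.execute("CREATE TABLE dept (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE emp (
    id INTEGER PRIMARY KEY,
    dept_id INTEGER NOT NULL REFERENCES dept(id)
)""")
conn.execute("INSERT INTO dept VALUES (1, 'GIS')")

# Referential integrity: a row pointing at a non-existent dept is rejected.
try:
    conn.execute("INSERT INTO emp VALUES (1, 99)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)

# Transaction: the context manager commits both inserts together,
# or rolls both back if either fails.
with conn:
    conn.execute("INSERT INTO emp VALUES (1, 1)")
    conn.execute("INSERT INTO emp VALUES (2, 1)")
print(conn.execute("SELECT COUNT(*) FROM emp").fetchone()[0])  # 2
```

The same constraints and transaction semantics carry over to MS SQL Server, with T-SQL's BEGIN TRANSACTION/COMMIT in place of the Python context manager.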