
    An Agent-Based Variogram Modeller: Investigating Intelligent, Distributed-Component Geographical Information Systems

    Geo-Information Science (GIScience) is the field of study that addresses substantive questions concerning the handling, analysis and visualisation of spatial data. Geo-Information Systems (GIS), including software, data acquisition and organisational arrangements, are the key technologies underpinning GIScience. A GIS is normally tailored to the service it is supposed to perform. However, there is often the need to perform a function that is not supported by the GIS tool being used. The usual solution in these circumstances is to look for another tool that can provide the service, and often for an expert to use that tool. This is expensive, time consuming and certainly stressful for the geographical data analyst. On the other hand, GIS is often used in conjunction with other technologies to form a geocomputational environment. One of the complex tools in geocomputation is geostatistics. One of its functions is to provide the means to determine the extent of spatial dependencies within geographical data and processes. Spatial datasets are often large and complex. Currently, agent systems are being integrated into GIS to offer flexibility and allow better data analysis. The thesis examines the current application of agents within the GIS community, determining whether they are used to represent data or processes, or to act as services. The thesis then sets out to demonstrate the applicability of an agent-oriented paradigm as a service-based GIS, with the possibility of providing greater interoperability and reducing resource requirements (human and tools). In particular, analysis was undertaken to determine the need to introduce enhanced features to agents, in order to maximise their effectiveness in GIS. This was achieved by addressing the complexity of designing and implementing software agents for the GIS environment and by suggesting possible solutions to the problems encountered.
The software agent characteristics and features (which include the dynamic binding of plans to software agents in order to tackle the levels of complexity and range of contexts) were examined, as were current GIScience and the applications of agent technology to GIS: agents as entities, objects and processes. These concepts and their functionality within GIS are then analysed and discussed. The extent of agent functionality, an analysis of the gaps, and the use of these technologies to express a distributed service providing an agent-based GIS framework are then presented. Thus, a general agent-based framework for GIS and a novel agent-based architecture for a specific part of GIS, the variogram, were devised to examine the applicability of the agent-oriented paradigm to GIS. An examination of the current mechanisms for constructing variograms and their underlying processes and functions was undertaken, and these processes were then embedded into a novel agent architecture for GIS. Once a successful software agent implementation had been achieved, the corresponding tool was tested and validated: internally for code errors and externally to determine whether it meets its functional requirements and enhances the GIS process of dealing with data. Thereafter, it is compared with other known service-based GIS agents and its advantages and disadvantages analysed.
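The abstract refers to the mechanisms underlying variogram construction without reproducing them. As a minimal sketch of the computation such an agent would encapsulate, the classical Matheron estimator of the empirical semivariogram can be written as follows (the function name and binning scheme are illustrative assumptions, not the thesis's implementation):

```python
import numpy as np

def empirical_semivariogram(coords, values, lag_edges):
    """Matheron estimator: gamma(h) = 1/(2 N(h)) * sum (z_i - z_j)^2,
    averaged over point pairs whose separation falls in each lag bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    # Pairwise separation distances and squared value differences.
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    sqdiff = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)  # count each pair once
    d, s = dist[iu], sqdiff[iu]
    gammas = []
    for lo, hi in zip(lag_edges[:-1], lag_edges[1:]):
        mask = (d >= lo) & (d < hi)
        gammas.append(0.5 * s[mask].mean() if mask.any() else np.nan)
    return np.array(gammas)
```

A fitted model (spherical, exponential, etc.) would then be adjusted to these binned estimates; in an agent-based design, estimation and model fitting could be delegated to separate cooperating agents.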

    Developing Error Handling Software for Object-Oriented Geographical Information

    The inclusion of error handling capabilities within geographical information systems (GIS) is seen by many as crucial to the future commercial and legal stability of the technology. This thesis describes the analysis, design, implementation and use of a GIS able to handle both geographical information (GI) and the error associated with that GI. The first stage of this process is the development of an error-sensitive GIS, able to provide core error handling functionality in a form flexible enough to be widely applicable to error-prone GI. Object-oriented (OO) analysis, design and programming techniques, supported by recent developments in formal OO theory, are used to implement an error-sensitive GIS within Laser-Scan Gothic OOGIS software. The combination of formal theory and GIS software implementation suggests that error-sensitive GIS are a practical possibility using OO technology. While the error-sensitive GIS is an important step toward full error handling systems, it is expected that most GIS users would require additional high-level functionality before use of error-sensitive GIS could become commonplace. There is a clear need to provide error handling systems that actively assist non-expert users in assessing, using and understanding error in GI. To address this need, an error-aware GIS offering intelligent, domain-specific error handling software tools was developed, based on the core error-sensitive functionality. In order to provide a stable software bridge between the flexible error-sensitive GIS and specialised error-aware software tools, the error-aware GIS makes use of a distributed systems component architecture. The component architecture allows error-aware software tools that extend core error-sensitive functionality to be developed with minimal time and cost overheads. Based on a telecommunications application in Kingston-upon-Hull, UK, three error-aware tools were developed to address particular needs identified within the application.
First, an intelligent hypertext system in combination with a conventional expert system was used to assist GIS users with error-sensitive database design. Second, an inductive learning algorithm was used to automatically populate the error-sensitive database with information about error, based on a small pilot error assessment. Finally, a visualisation and data integration tool was developed to allow access to the error-sensitive database and error propagation routines to users across the Internet. While a number of important avenues of further work are implied by this research, its results provide a blueprint for the development of practical error handling capabilities within GIS. The architecture used is both robust and flexible, and arguably represents a framework both for future research and for the development of commercial error handling GIS.
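The error propagation routines themselves are not shown in the abstract. A minimal sketch of one common approach in this literature, Monte Carlo propagation of vertex positional error through a polygon-area calculation, is given below (all names are hypothetical; this is an illustration of the technique, not the Gothic OOGIS implementation):

```python
import math
import random

def propagate_area_error(polygon, sigma, n_sim=5000, seed=42):
    """Monte Carlo error propagation: perturb each polygon vertex with
    Gaussian positional error of std dev `sigma`, recompute the area
    each time (shoelace formula), and report the mean and std dev of
    the simulated areas."""
    rng = random.Random(seed)

    def shoelace(pts):
        area = 0.0
        for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1]):
            area += x1 * y2 - x2 * y1
        return abs(area) / 2.0

    areas = []
    for _ in range(n_sim):
        noisy = [(x + rng.gauss(0, sigma), y + rng.gauss(0, sigma))
                 for x, y in polygon]
        areas.append(shoelace(noisy))
    mean = sum(areas) / n_sim
    var = sum((a - mean) ** 2 for a in areas) / (n_sim - 1)
    return mean, math.sqrt(var)
```

An error-aware tool could attach the resulting standard deviation to the derived area as metadata, so that non-expert users see the uncertainty alongside the value.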

    Geomatics for Mobility Management. A comprehensive database model for Mobility Management

    In urban and metropolitan contexts, Traffic Operations Centres (TOCs) use technologies such as Geographic Information Systems (GIS) and Intelligent Transport Systems (ITS) to tackle urban mobility issues. Usually in TOCs, various isolated systems are maintained in parallel (stored in different databases), and data comes from different sources: a challenge in transport management is to transfer disparate data into a unified data management system that preserves access to legacy data, allowing multi-thematic analysis. This integration between systems is important for wise policy decisions. This study aims to design a comprehensive and general spatial data model that allows the integration and visualisation of traffic components and measures. The activity focuses on the case study of the 5T Agency in Turin, a TOC that manages traffic regulation, public transit fleets and information to users in the metropolitan area of Turin and the Piedmont Region. In particular, the agency has over the years set up a wide system of ITS technologies that continuously acquires measures and traffic information, which are used to deploy information services to citizens and public administrations. However, the spatial nature of these data is not fully considered in the daily operational activity, resulting in difficulties in information integration. Indeed, the agency lacks a complete GIS that includes all the management information in an organised, spatial and “horizontal” vision. The main research question concerns the integration of different kinds of data in a single GIS spatial data model. Spatial data interoperability is critical and particularly challenging because geographic data definitions in legacy databases can vary widely: different data formats and standards, data inconsistencies, different spatial and temporal granularities, and different methods and enforcing rules that relate measures, events and physical infrastructures.
The idea is not to replace the existing, efficient systems already implemented, but to build on top of them a GIS that spans the different software and DBMS platforms and that can demonstrate how a spatial, horizontal vision of urban mobility issues may be useful for policy and strategy decisions. The modelling activity takes reference from a review of transport standards and results in a general database schema, which can be reused by other TOCs in their activities, helping integration and coordination between different TOCs. The final output of the research is an ArcGIS geodatabase, tailored to 5T data requirements, which enables the customised representation of private traffic elements and measures. Specific custom scripts have been developed to allow the extraction and temporal aggregation of traffic measures and events. The proposed solution allows the reuse of data and measures for custom purposes, without requiring deep knowledge of the entire ITS environment. In addition, the proposed ArcGIS geodatabase solution is optimised for limited-power computing environments. A case study was examined in depth to evaluate the suitability of the database: a comparison between damages detected by Emergency Mapping Services (EMS) and Traffic Message Channel traffic events was conducted, evaluating the utility of 5T historical traffic event information from the Piedmont floods of November 2016 for EMS services.
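The custom extraction scripts are not reproduced in the abstract. Purely as an illustrative sketch, temporal aggregation of raw traffic measures into hourly means per road section might look like the following (the tuple layout and function name are assumptions, not 5T's actual schema):

```python
from collections import defaultdict
from datetime import datetime

def aggregate_hourly(measures):
    """Aggregate (timestamp, section_id, flow) tuples into the mean
    flow per road section per hour by truncating each timestamp to
    the start of its hour and grouping on (section, hour)."""
    buckets = defaultdict(list)
    for ts, section, flow in measures:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        buckets[(section, hour)].append(flow)
    return {key: sum(v) / len(v) for key, v in buckets.items()}
```

In the geodatabase described above, the aggregated values would then be joined back to the road-section geometries so the hourly means can be mapped and compared against event data such as the 2016 flood records.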

    A web-based graphical user interface to display spatial data

    This dissertation presents the design and implementation of a graphical user interface (GUI) to display spatial data in a web-based environment. The work is a case study for a web-based framework for distributed applications, the Web Computing Skeleton, using a distributed open spatial query mechanism to display the geographic data. The design is based on investigation of geographic information systems (GISs), GUI design and properties of spatial query mechanisms. The purpose of the GUI is to integrate information about a geographic area; display, manipulate and query geographic-based spatial data; execute queries about spatial relationships; and analyse the attribute data to calculate the shortest routes for emergency response. The GUI is implemented as a Java applet embedded in a web document that communicates with the application server via generic GIS classes that provide a common interface to the various GIS data sources used in the spatial query mechanism to access a geographic database. Features that are supported by the distributed open spatial query mechanism include a basic set of spatial selection criteria, spatial selection based on pointing, specification of a query window, description of a map scale and identification of a map legend. The design is based on a formal design process that includes the selection of a conceptual model; the identification of task flow, major windows and dialog flow; the definition of fields and detailed window layout; and finally the definition of field constraints and defaults. The conceptual model characterises the application and provides a framework for users to learn the system model. This model is conceptualised as a map that the user manipulates directly. Unlike a typical map, which just shows spatial data such as roads, cities, and country borders, the GIS links attribute data like population statistics to the spatial data.
This link between the map data and the attribute data makes the GIS a powerful tool to manipulate and display data. To measure the performance of displaying spatial data, two main factors are considered, namely processing speed and display quality. Factors that affect the processing speed include the rate of data transfer from the generic GIS classes, the rate at which data is downloaded over the network and the speed of execution of the drawing. Two factors that influence the spatial data display quality are pixel distance and bitmap quality. The pixel distance set in the geographic database is represented by two pixels on the display screen, which affects the display quality since the pixel distance is the upper limit for display granularity. This means that setting the pixel distance is a trade-off between the processing speed and the display quality. Bitmaps are raster images that are made up of pixels or cells. To improve the raster image quality, the bitmap resolution can be adjusted to display more pixels per centimetre.
Dissertation (MSc (Computer Science))--University of Pretoria, 2007.
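The shortest-route analysis mentioned above is a standard graph computation. A minimal sketch using Dijkstra's algorithm over an adjacency dictionary is shown below; this illustrates the technique only, and the dictionary layout and function name are assumptions rather than the dissertation's Java applet code:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over an adjacency dict of the form
    {node: [(neighbour, cost), ...]}; returns (total_cost, path).
    Suitable for road networks with non-negative edge costs."""
    queue = [(0.0, start, [start])]  # (cost so far, node, path taken)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return float("inf"), []
```

For emergency response, edge costs would typically be travel times derived from the attribute data rather than raw distances, so the same routine returns the fastest rather than the geometrically shortest route.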

    Implementation of computer visualisation in UK planning

    PhD Thesis. Within the processes of public consultation and development management, planners are required to consider spatial information and to appreciate spatial transformations and future scenarios. In the past, conventional media such as maps, plans, illustrations, sections, and physical models have been used. These traditional visualisations involve a high degree of abstraction, are sometimes difficult for lay people to understand, and are inflexible in terms of the range of scenarios which can be considered. Yet due to technical advances and falling costs, the potential for computer-based visualisation has much improved, and it has been increasingly adopted within the planning process. Despite the growth in this field, insufficient consideration has been given to the possible weaknesses of computerised visualisations. Reflecting this lack of research, this study critically evaluates the use and potential of computerised visualisation within this process. The research is divided into two components: case study analysis and reflections of the author following his involvement in the design and use of visualisations in a series of planning applications; and in-depth interviews with experienced practitioners in the field. Based on a critical review of existing literature, this research explores in particular the issues of credibility, realism and costs of production. The research findings illustrate the importance of the credibility of visualisations, a topic given insufficient consideration within the academic literature. Whereas the realism of visualisations has been the focus of much previous research, the results of the case studies and interviews with practitioners undertaken in this research suggest a ‘photo-realistic’ level of detail may not be required as long as the observer considers the visualisations to be a credible reflection of the underlying reality.
Although visualisations will always be a simplification of reality and their level of realism is subjective, there is still potential for developing guidelines or protocols for image production based on commonly agreed standards. In the absence of such guidelines there is a danger that scepticism about the credibility of computer visualisations will prevent the approach being used to its full potential. These findings suggest there needs to be a balance between scientific protocols and artistic licence in the production of computer visualisation. In order to be sufficiently credible for use in decision making within the planning processes, the production of computer visualisation needs to follow a clear methodology and scientific protocols set out in good practice guidance published by professional bodies and governmental organisations.

    National freight transport planning: towards a Strategic Planning Extranet Decision Support System (SPEDSS)

    This thesis provides a 'proof-of-concept' prototype and a design architecture for an Object Oriented (OO) database towards the development of a Decision Support System (DSS) for the national freight transport planning problem. Both governments and industry require a Strategic Planning Extranet Decision Support System (SPEDSS) for the effective management of national Freight Transport Networks (FTNs). This thesis addresses the three key problems for the development of a SPEDSS to facilitate national strategic freight planning: 1) the scope and scale of data available and required; 2) the scope and scale of existing models; and 3) the construction of the software. The research approach taken embodies systems thinking and includes the use of: Object Oriented Analysis and Design (OOA/D) for problem encapsulation and database design; an artificial neural network (and proposed rule extraction) for knowledge acquisition from the United States FTN data set; and an iterative Object Oriented (OO) software design for the development of a 'proof-of-concept' prototype. The research findings demonstrate that an OO approach, along with the use of OO methodologies and technologies coupled with artificial neural networks (ANNs), offers a robust and flexible methodology for the analysis of the FTN problem domain and for the design architecture of an Extranet-based SPEDSS.
The objectives of this research were to: 1) identify and analyse current problems and proposed solutions facing industry and governments in strategic transportation planning; 2) determine the functional requirements of an FTN SPEDSS; 3) perform a feasibility analysis for building an FTN SPEDSS 'proof-of-concept' prototype and OO database design; 4) develop a methodology for a national 'internet-enabled' SPEDSS model and database; 5) construct a 'proof-of-concept' prototype for a SPEDSS encapsulating identified user requirements; 6) develop a methodology to resolve the issue of the scale of data and of data knowledge acquisition, which would act as the 'intelligence' within a SPEDSS; 7) implement the data methodology using Artificial Neural Networks (ANNs) towards its validation; and 8) make recommendations for national freight transportation strategic planning and the further research required to fulfil the needs of governments and industry. This thesis includes: an OO database design for encapsulation of the FTN; an 'internet-enabled' Dynamic Modelling Methodology (DMM) for the virtual modelling of FTNs; a Unified Modelling Language (UML) 'proof-of-concept' prototype; and conclusions and recommendations for further collaborative research.
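The thesis's actual network architecture and the United States FTN data set are not shown in the abstract. Purely as an illustration of ANN-based supervised knowledge acquisition, a single logistic neuron trained by gradient descent on a toy, linearly separable data set can be sketched as follows (this is a generic stand-in, not the thesis's model):

```python
import math
import random

def train_neuron(samples, labels, epochs=200, lr=0.5, seed=0):
    """Train one logistic neuron by stochastic gradient descent on
    binary-labelled feature vectors; returns (weights, bias)."""
    rng = random.Random(seed)
    n = len(samples[0])
    w = [rng.uniform(-0.1, 0.1) for _ in range(n)]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
            err = p - y                     # gradient of log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify a feature vector with the trained neuron."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
```

The rule extraction proposed in the thesis would go one step further, distilling trained weights like these into human-readable decision rules usable inside the SPEDSS.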

    Proceedings of the Workshop on Parallel/High-Performance Object-Oriented Scientific Computing (POOSC '03)


    Dynamics of disturbed Mexican pine-oak forest a modelling approach


    Intelligent simulation of coastal ecosystems

    Doctoral thesis. Informatics Engineering. Faculdade de Engenharia, Universidade do Porto; Faculdade de Ciência e Tecnologia, Universidade Fernando Pessoa. 201