8,059 research outputs found

    Innovative approaches to urban data management using emerging technologies

    Many characteristics of smart cities rely on a sufficient quantity and quality of urban data. Local industry and developers can use this data to build applications that improve the lives of all citizens; the handling and usability of this data is therefore a major challenge for smart cities. In this paper we investigate new approaches to urban data management using emerging technologies and give insight into further research conducted within the EC-funded smarticipate project. Geospatial data cannot be handled well in classical relational database environments: either it is stored as binary large objects, or it has to be broken down into elementary types the database can handle, in many cases resulting in a slow system, since classical relational databases are optimized for online transaction processing rather than analytic processing and are not tuned for delivering mass data. Document-based databases provide better performance, but still struggle with large binary objects. The heterogeneity of the data also requires a lot of mapping and data cleansing, and in some cases replication cannot be avoided. Another approach is to use Semantic Web technologies to enrich the data and build up relations and connections between entities. However, data formats such as RDF follow a different approach and are not well suited to geospatial data, which limits usability.

    Search engines are a good example of web applications with high usability: users must be able to find the right data and to get information on related or close matches, which enables information retrieval in an easy-to-use fashion. The same principles should be applied to geospatial data, which would improve the usability of open data. Combined with data mining and big data technologies, these principles would improve the usability of open geospatial data and even lead to new ways of using it. By helping with the interpretation of data in a certain context, data is transformed into useful information.

    In this paper we analyse key features of open geodata portals, such as linked data and machine learning, in order to show ways of improving the user experience. Based on the smarticipate project, we then show how open data and geodata can be published online and examine their practical application. We also give an outlook on pilot cases in which we want to evaluate how the technologies presented in this paper can be combined into a useful open data portal. In contrast to the previous EC-funded project urbanapi, where participative processes in smart cities were created with urban data, we go one step further with the Semantic Web and open data. We thereby achieve a more general approach to open data portals for spatial data and to improving their usability. The envisioned architecture of the smarticipate project relies on file-based storage and a no-copy strategy, which means that data is mostly kept in its original format; conversion to another format is only done if necessary (e.g. the current format has limitations on domain-specific attributes, or the user requests a specific format). A strictly functional approach and architecture is envisioned, which allows massively parallel execution and is therefore predestined for deployment in a cloud environment. The search interface uses a domain-specific vocabulary which can be customised for special purposes or for particular users, taking their context and expertise into account, and which abstracts from technology-specific peculiarities.

    Application programmers will also benefit from this architecture, as linked data principles will be followed extensively. For example, the JSON and JSON-LD standards will be used, so that web developers can use results from the data store directly, without the need for conversion. Links to further information will also be provided within the data, so that a drill-down to more details is possible. The remainder of this paper is structured as follows. After the introduction about open data and data in general, we look at related work and existing open data portals. This leads to the main chapter about the key technology aspects of an easy-to-use open data portal. This is followed by Chapter 5, an introduction to the EC-funded project smarticipate, in which the key technology aspects of Chapter 4 will be included.
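
    The abstract describes JSON-LD delivery with embedded links for drill-down but gives no code. As a minimal, hedged sketch of that idea, the snippet below builds a JSON-LD document for a single geospatial feature; the vocabulary URLs, property names and example URLs are illustrative assumptions, not the smarticipate project's actual schema.

```python
import json

# Hypothetical JSON-LD response for one geospatial feature. The @context
# maps short property names onto well-known vocabularies (schema.org,
# GeoSPARQL); the "details" link enables a drill-down to more information.
feature = {
    "@context": {
        "name": "http://schema.org/name",
        "geo": "http://www.opengis.net/ont/geosparql#",
        "details": {"@id": "http://schema.org/url", "@type": "@id"},
    },
    "@id": "https://data.example.org/trees/42",      # illustrative URL
    "name": "Plane tree, Market Square",
    "geo:asWKT": "POINT(8.4037 49.0069)",
    "details": "https://data.example.org/trees/42/full",
}

print(json.dumps(feature, indent=2))
```

    A web client can consume such a response directly as ordinary JSON, while linked-data-aware clients can resolve the context terms and follow the embedded link, with no format conversion in either case.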

    Production of semi real time media-GIS contents using MODIS imagery

    Delivering environmental disaster information to the public swiftly, attractively, meaningfully and accurately is becoming a competitive task among spatial data visualization experts. The data visualization process has to follow the basics of spatial data visualization to maintain the academic quality and the spatial accuracy of the content. Here, "Media-GIS" can be promoted as one of the latest sub-forms of GIS, one which targets the mass media. Under Media-GIS, "Present", the first of the three roles of data visualization, carries the major workload compared with the other two, "Analysis" and "Explore". When presenting content, optimizing the main graphical variables (size, value, texture, hue, orientation and shape) is vital with regard to the target market (age group, social group) and the medium (print, TV, web, mobile). This study emphasizes the application of freely available MODIS true-colour images to produce near-real-time content on environmental disasters while minimizing production cost. When the first news of a significant environmental disaster breaks, the relevant MODIS (250 m) images can be extracted in GeoTIFF and KML (Keyhole Markup Language) formats from the MODIS website. The original KML file can be overlaid on Google Earth to collect more spatial information about the disaster site. Then, from the ArcGIS environment, the GeoTIFF file can be transferred into Photoshop to produce the graphics of the target spot. This media-friendly Photoshop file can be used as an independent content item without geo-references, or imported into ArcGIS and converted into KML format, which carries geo-references. The KML file, a graphically enhanced content item with extra information on the environmental disaster, can be used on TV and the web through Google Earth, and sub-productions can be directed into print and mobile content. If the data processing can be automated, the system will be able to produce media content faster. A case study on the recent undersea oil spill in the Gulf of Mexico is included in the report to highlight the main aspects discussed in the methodology.
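
    The abstract notes that automating the processing chain would speed up content production. Below is a minimal sketch of one automatable step, assuming a MODIS GeoTIFF has already been downloaded and a graphically enhanced PNG has been exported from Photoshop (the file names are placeholders, and the GeoTIFF is assumed to be in geographic lat/lon coordinates): it reads the image bounds with rasterio and wraps the PNG into a geo-referenced KML GroundOverlay for Google Earth.

```python
import rasterio   # pip install rasterio
import simplekml  # pip install simplekml

# Read the geographic bounds of the downloaded MODIS scene.
# rasterio's bounds are (left, bottom, right, top).
with rasterio.open("modis_scene.tif") as src:
    west, south, east, north = src.bounds

# Wrap the graphically enhanced export into a geo-referenced overlay.
kml = simplekml.Kml()
overlay = kml.newgroundoverlay(name="MODIS disaster scene")
overlay.icon.href = "modis_scene_enhanced.png"  # e.g. exported from Photoshop
overlay.latlonbox.west = west
overlay.latlonbox.south = south
overlay.latlonbox.east = east
overlay.latlonbox.north = north
kml.save("modis_scene.kml")  # ready to open in Google Earth
```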

    From SpaceStat to CyberGIS: Twenty Years of Spatial Data Analysis Software

    This essay assesses the evolution of the way in which spatial data analytical methods have been incorporated into software tools over the past two decades. It is part retrospective, part prospective, going beyond a historical review to outline some of the important factors that drove software development, such as methodological advances, the open source movement, and the advent of the internet and cyberinfrastructure. The review highlights activities carried out by the author and his collaborators, using SpaceStat, GeoDa, PySAL and recent spatial analytical web services developed at the ASU GeoDa Center as illustrative examples. It outlines a vision for a spatial econometrics workbench as an example of the incorporation of spatial analytical functionality in a cyberGIS.
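
    As a small, illustrative taste of the kind of analysis the tools reviewed here expose, the sketch below computes global Moran's I with the modern PySAL packages (libpysal and esda). The shapefile and the HOVAL attribute refer to the classic Columbus sample data and stand in for any polygon layer with a numeric attribute; this is not code from the essay itself.

```python
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran

gdf = gpd.read_file("columbus.shp")   # any polygon layer (placeholder path)
w = Queen.from_dataframe(gdf)         # queen-contiguity spatial weights
w.transform = "r"                     # row-standardise the weights
mi = Moran(gdf["HOVAL"], w)           # global spatial autocorrelation
print(f"Moran's I = {mi.I:.3f}, pseudo p-value = {mi.p_sim:.3f}")
```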

    CLOUD-BASED SOLUTIONS IMPROVING TRANSPARENCY, OPENNESS AND EFFICIENCY OF OPEN GOVERNMENT DATA

    A central pillar of open government programs is the disclosure of data held by public agencies using Information and Communication Technologies (ICT). This disclosure relies on the creation of open data portals (e.g. Data.gov) and has come to be associated with the expression Open Government Data (OGD). The overall goal of these governmental initiatives is not limited to enhancing the transparency of public sectors but also aims to raise awareness of how released data can be put to use in order to enable the creation of new products and services by private sectors. Despite the use of technological platforms to facilitate access to government data, open data portals continue to be organized to serve the goals of public agencies without opening the doors to public accountability, information transparency, public scrutiny, etc. This thesis considers the basic aspects of OGD, including the definition of technical models for organizing such complex contexts, the identification of techniques for combining data from several portals, and the proposal of user interfaces that focus on citizen-centred usability. In order to deal with these issues, this thesis presents a holistic approach to OGD that aims to go beyond the problems inherent in simple disclosure by providing a tentative answer to the following questions: 1) To what extent do OGD-based applications contribute to the creation of innovative, value-added services? 2) What technical solutions could increase the strength of this contribution? 3) Can Web 2.0 and cloud technologies favour the development of OGD apps? 4) How should a common framework be designed for developing OGD apps that rely on multiple OGD portals and external web resources? In particular, this thesis focuses on devising computational environments that leverage the content of OGD portals (supporting the initial phase of data disclosure) for the creation of new services that add value to the original data. The thesis is organized as follows. In order to offer a general view of OGD, some important aspects of open data initiatives are presented, including their state of the art, the existing approaches for publishing and consuming OGD across web resources, and the factors shaping the value generated through government data portals. Then, an architectural framework is proposed that gathers OGD from multiple sites and supports the development of cloud-based apps that leverage these data along potentially different exploitation routes, ranging from traditional business to specialized support for citizens. The proposed framework is validated by two cloud-based apps, namely ODMap (Open Data Mapping) and NESSIE (A Network-based Environment Supporting Spatial Information Exploration). In particular, ODMap supports citizens in searching for and accessing OGD from several web sites. NESSIE organizes data captured from real estate agencies and public agencies (i.e. municipalities, cadastral offices and chambers of commerce) in order to provide citizens with a geographic representation of real estate offers and relevant statistics about price trends.
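
    The thesis abstract does not include ODMap's code, but many OGD portals (including Data.gov) expose the CKAN API, so a hedged sketch of the kind of multi-portal harvesting described above can be written against CKAN's package_search action. The portal list and query below are illustrative, not the apps' actual configuration.

```python
import requests

# Any CKAN-based portals; catalog.data.gov and CKAN's public demo instance
# are used here purely as examples.
PORTALS = [
    "https://catalog.data.gov",
    "https://demo.ckan.org",
]

def search_portal(base_url: str, query: str, rows: int = 5) -> list[str]:
    """Return dataset titles matching `query` on one CKAN portal."""
    resp = requests.get(
        f"{base_url}/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    return [pkg["title"] for pkg in resp.json()["result"]["results"]]

# Gather matching datasets across all configured portals.
for portal in PORTALS:
    print(portal, search_portal(portal, "real estate"))
```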

    Web Data Extraction, Applications and Techniques: A Survey

    Web Data Extraction is an important problem that has been studied by means of different scientific tools and in a broad range of applications. Many approaches to extracting data from the Web have been designed to solve specific problems and operate in ad-hoc domains. Other approaches, instead, heavily reuse techniques and algorithms developed in the field of Information Extraction. This survey aims at providing a structured and comprehensive overview of the literature in the field of Web Data Extraction. We provide a simple classification framework in which existing Web Data Extraction applications are grouped into two main classes, namely applications at the Enterprise level and at the Social Web level. At the Enterprise level, Web Data Extraction techniques emerge as a key tool for performing data analysis in Business and Competitive Intelligence systems as well as for business process re-engineering. At the Social Web level, Web Data Extraction techniques make it possible to gather the large amounts of structured data continuously generated and disseminated by Web 2.0, Social Media and Online Social Network users, which offers unprecedented opportunities to analyse human behaviour at a very large scale. We also discuss the potential of cross-fertilization, i.e., the possibility of reusing Web Data Extraction techniques originally designed to work in a given domain in other domains.
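
    As a minimal illustration of the wrapper-style extraction techniques the survey classifies, the sketch below turns a semi-structured HTML fragment into structured records; the HTML and class names are invented for the example and do not come from the survey.

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

# Toy HTML standing in for a semi-structured web page.
html = """
<div class="product"><span class="name">Widget</span>
  <span class="price">9.99</span></div>
<div class="product"><span class="name">Gadget</span>
  <span class="price">19.99</span></div>
"""

soup = BeautifulSoup(html, "html.parser")

# A hand-written "wrapper": one extraction rule per field, applied to
# every repeating record element on the page.
records = [
    {
        "name": div.find("span", class_="name").get_text(strip=True),
        "price": float(div.find("span", class_="price").get_text(strip=True)),
    }
    for div in soup.find_all("div", class_="product")
]
print(records)  # [{'name': 'Widget', 'price': 9.99}, {'name': 'Gadget', ...}]
```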

    Tracking digital impact (TDI) tool.

    The Tracking Digital Impact (TDI) tool is designed to help researchers, research groups, projects and institutions assess their current and future digital engagement strategies in an objective and informed way, and to support the development of new and improved strategies that more effectively enable engagement with businesses, communities, the public, governing bodies and other researchers, leading to greater impact. The TDI tool was developed as part of a JISC-funded project which focused on identifying, synthesising and embedding business, community and public (BCE) engagement best practices. The project examined the best practices at the dot.rural Digital Economy Hub at the University of Aberdeen and translated those, accompanied by new guidance, into the TDI tool. Parts of this document were sourced from 'Brief Notes on Social Media for Research' by Jennifer Holden (University of Aberdeen, October 2012). This document describes the TDI tool and its use.

    A More Decentralized Vision for Linked Data

    In this deliberately provocative position paper, we claim that ten years into Linked Data there are still (too?) many unresolved challenges on the way to a truly machine-readable and decentralized Web of data. We take a deeper look at the biomedical domain, currently one of the most promising "adopters" of Linked Data, if we believe the ever-present "LOD cloud" diagram. Herein, we try to highlight and exemplify key technical and non-technical challenges to the success of LOD, and we outline potential solution strategies. We hope that this paper will serve as a discussion basis for a fresh start towards more actionable, truly decentralized Linked Data, and as a call to the community to join forces.
    Series: Working Papers on Information Systems, Information Business and Operations