47 research outputs found

    WikiSensing: A collaborative sensor management system with trust assessment for big data

    Big Data for sensor networks and collaborative systems has become ever more important in the digital economy and is a focal point of technological interest, while posing many noteworthy challenges. This research addresses some of these challenges in the areas of online collaboration and Big Data for sensor networks. It demonstrates WikiSensing (www.wikisensing.org), a high-performance, heterogeneous, collaborative data cloud for managing and analysing real-time sensor data. The system is based on a Big Data architecture with comprehensive functionality for smart-city sensor data integration and analysis. It is fully functional and served as the main data management platform for the 2013 UPLondon Hackathon. The system is unique in introducing a novel methodology that incorporates online collaboration with sensor data. While other platforms are available for sensor data management, WikiSensing is one of the first to enable online collaboration by providing services to store and query dynamic sensor information without restricting the type or format of the sensor data. An emerging challenge for collaborative sensor systems is modelling and assessing the trustworthiness of sensors and their measurements. This is directly relevant to WikiSensing as an open, collaborative sensor data management system: if the trustworthiness of the sensor data can be accurately assessed, WikiSensing becomes more than a collaborative data management system for sensors, but also a platform that informs its users about the validity of its data. Hence this research presents a new generic framework for capturing and analysing sensor trustworthiness, considering the different forms of evidence available to the user. It uses an extensible set of metrics to represent such evidence and applies Bayesian analysis to develop a trust classification model.
Several publications are based on this work, and others are at the final stage of submission. Further improvements are also planned to make the platform serve as a cloud service accessible to any online user, building up a community of collaborators for smart-city research. Open Access
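As an illustration only, the kind of Bayesian trust classification described above can be sketched as a Beta-Bernoulli update over evidence of sensor consistency. The names (`SensorTrust`, `observe`, `score`, `classify`) and the 0.7 threshold are hypothetical and are not part of WikiSensing's actual API or model:

```python
class SensorTrust:
    """Tracks trust in one sensor via a Beta(alpha, beta) posterior."""

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        # Start from a uniform Beta(1, 1) prior: no evidence either way.
        self.alpha = alpha  # pseudo-count of trustworthy observations
        self.beta = beta    # pseudo-count of untrustworthy observations

    def observe(self, consistent: bool, weight: float = 1.0) -> None:
        # Each piece of evidence (e.g. agreement with co-located sensors)
        # shifts the posterior by its weight.
        if consistent:
            self.alpha += weight
        else:
            self.beta += weight

    def score(self) -> float:
        # Posterior mean probability that the sensor is trustworthy.
        return self.alpha / (self.alpha + self.beta)

    def classify(self, threshold: float = 0.7) -> str:
        return "trusted" if self.score() >= threshold else "untrusted"


t = SensorTrust()
for consistent in [True, True, True, False, True]:
    t.observe(consistent)
print(round(t.score(), 2), t.classify())
```

The extensible-metrics idea in the abstract would correspond to calling `observe` with different evidence sources and weights, each contributing to the same posterior.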

    Building a Visualiser for the Gameplay Design Patterns Wiki

    This thesis presents a more useful visualization and filtering system for the Gameplay Design Patterns Wiki, the most extensive and up-to-date collection of gameplay design patterns. The aim of this research is to understand whether the wiki can be made easier to understand without changing its content. To do this, a study is conducted that identifies common game design and analysis tasks requiring external information; these form the basis of the requirements for the visualization and filtering system. The system, GDPVis, is designed and implemented as an open-source, in-browser, single-page application that visualizes and filters the information found in the Gameplay Design Patterns Wiki. GDPVis incorporates node-link diagrams to display the relationships between patterns, and a node-based visual filtering system allowing fine-grained control over which games or patterns are displayed. GDPVis is evaluated against the original wiki in two studies, which show the system to be promisingly more useful (more time-efficient, easier, more engaging, and more satisfying to use) in many specific scenarios, as well as generally for teaching students about gameplay design patterns.
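The pattern relationships that GDPVis renders as node-link diagrams can be thought of as a labelled directed graph that is filtered before display. The sketch below is purely illustrative; the sample patterns, relation names, and `filter_edges` helper are assumptions for this example, not GDPVis code or wiki data:

```python
# Hypothetical pattern graph: pattern -> list of (relation, related pattern).
patterns = {
    "Resources": [("instantiates", "Ownership"), ("modulates", "Trading")],
    "Ownership": [("instantiates", "Trading")],
    "Trading": [],
}

def filter_edges(graph: dict, relation: str):
    """Keep only edges of the given relation, as (source, target) pairs."""
    return [(src, dst)
            for src, links in graph.items()
            for rel, dst in links
            if rel == relation]

# Show only the "instantiates" relationships before drawing the diagram.
print(filter_edges(patterns, "instantiates"))
```

A visual filtering system of the kind described would compose several such predicates (by relation, by game, by pattern set) before handing the reduced edge list to the node-link renderer.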

    The Analysis of Open Source Software and Data for Establishment of GIS Services Throughout the Network in a Mapping Organization at National or International Level

    Federal agencies and their partners collect and manage large amounts of geospatial data, but it is often not easily found when needed, and sometimes data is collected or purchased multiple times. In short, the best government data is not always organized and managed efficiently enough to support timely and cost-effective decision making. National mapping agencies, and the various departments and authorities responsible for collecting different types of geospatial data, cannot continue for long to operate as they did a few years ago, as if living on an island. Leaders need to look at what is now possible that was not possible before, considering capabilities such as cloud computing, crowd-sourced data collection, openly available remotely sensed data, and multi-source information vital to decision making, as well as new web-accessible services that are sometimes provided at no cost. Many of these services could previously be obtained only from local GIS experts. These authorities need to assess the available solutions, gather information about new capabilities, reconsider agency missions and goals, review and revise policies, make budget and human-resource decisions, and evaluate new products, cloud services, and cloud service providers. To do so, the right tools must be chosen to reach the above-mentioned goals. Data collection is the most costly part of mapping and of establishing a geographic information system: not only because of the cost of the data collection task itself, but also because of the damage caused by delay, that is, the time it takes to bring the proper decision-making information from the field into the user's hands. In fact, the time a project spends on the collection, processing, and presentation of geospatial information has an even greater effect on the cost of larger projects such as disaster management, construction, city planning, the environment, etc.
This is, of course, under the assumption that all the necessary information is delivered from existing sources directly to the user's computer. The best description of a well-optimized GIS project is a methodology that reduces time and cost while increasing data and service quality (meaning accuracy, currency, completeness, consistency, suitability, information content, integrity, integration capability, and fitness for use, as well as the user's specific needs and conditions, which must be addressed with special attention). Each of these issues must be addressed individually, and at the same time the overall solution must be provided in a holistic manner that considers all the criteria. This thesis first discusses the problem we are facing and what needs to be done to establish a National Spatial Data Infrastructure (NSDI), including its definition and related components. It then surveys the available open-source software solutions covering the whole process: data collection, database management, data processing, and finally data services and presentation. The first distinction among software packages is whether they are open source and free, or commercial and proprietary. To make this distinction it is necessary to define a clear specification for the categorization: from a legal point of view it can be difficult to determine which class a given package belongs to, so the various terms must first be clarified. With reference to this concept there are two global distinctions; inside each group, we then distinguish a further classification by the functionality and the GIScience applications the software is made for.
Building on the outcome of the second chapter, which defines the technical process for selecting suitable and reliable software according to the users' needs and the required components, Chapter 3 elaborates on the details of the GeoNode software as the best candidate tool for the tasks stated above. Chapter 4 discusses the globally available open-source data against predefined data quality criteria (such as theme, data content, scale, licensing, and coverage) according to the metadata statements inside the datasets, by means of bibliographic review, technical documentation, and web search engines. Chapter 5 discusses further data quality concepts and accordingly defines a set of protocols for evaluating all datasets against the tasks for which a mapping organization is, in general, responsible to its probable users in different disciplines, such as reconnaissance, city planning, topographic mapping, transportation, environmental control, disaster management, etc. In Chapter 6, all the data quality assessments and protocols are applied to the pre-filtered, proposed datasets. In the final scores and ranking, each dataset receives a value corresponding to its quality according to the rules defined in the previous chapter. In the last step, a weight vector is derived from questions answered by the user with reference to the project at hand, in order to finalize the most appropriate selection of free and open-source data. This data quality preference is defined by identifying a weight vector, which is then applied to the quality matrix to obtain the final quality scores and ranking.
At the end of this chapter there is a section presenting the use of the datasets in various projects, such as "Early Impact Analysis" and "Extreme Rainfall Detection System (ERDS), version 2", performed by ITHACA. Finally, the conclusion discusses the important criteria as well as future trends in GIS software, and closes with recommendations.
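The weight-vector-times-quality-matrix step described above can be sketched as a simple weighted sum. The criteria, candidate datasets, scores, and weights below are invented for illustration and are not the values used in the thesis:

```python
# Quality matrix: each candidate dataset scored 0-10 on each criterion
# (criteria order: accuracy, currency, completeness, coverage).
quality = {
    "OpenStreetMap": [8, 9, 7, 9],
    "SRTM":          [7, 4, 9, 10],
}

# Weight vector derived from the user's answers about the project at hand;
# weights sum to 1 so the final score stays on the 0-10 scale.
weights = [0.4, 0.3, 0.2, 0.1]

def rank(quality: dict, weights: list):
    """Weighted sum per dataset, ranked best first."""
    scores = {name: sum(q * w for q, w in zip(row, weights))
              for name, row in quality.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

for name, score in rank(quality, weights):
    print(f"{name}: {score:.2f}")
```

A project that prioritizes currency over accuracy would simply answer the questionnaire differently, yielding a different weight vector and possibly a different ranking from the same quality matrix.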

    Disaster Relief 2.0: The Future of Information Sharing in Humanitarian Emergencies

    Outlines the challenges of, and recommendations for, creating an effective interface between humanitarian groups and the volunteer and technical communities that aggregate, visualize, and analyze data on and from affected communities to support relief efforts.

    Adaptive object-modeling : patterns, tools and applications

    Doctoral Program thesis. Informatics. Universidade do Porto, Faculdade de Engenharia. 201

    The web-based simulation and information service for multi-hazard impact chains. Design document.

    The overall objective of the PARATUS project and its platform is the co-development of a web-based simulation and information service for first and second responders and other stakeholders, for evaluating the impact chains of multi-hazard events, with particular emphasis on cross-border and cascading impacts. This deliverable provides a first impression of the platform and its components. A central theme in the PARATUS project is the co-development of the tools with stakeholders; the central stakeholders within the four application case studies are therefore full project partners and will be directly involved in the development of the platform. We foresee that the PARATUS platform will have two major blocks: an information service that provides static (or regularly updated) information, and a simulation service, a dynamic component where stakeholders can interactively work with the tools in the platform. The PARATUS project will further make sure that documentation (e.g., software accompanying documentation) is also publicly available via the project website and other trusted repositories. Deliverable 4.1 was submitted to the European Commission on 31/07/2023 and is awaiting approval by the Research Executive Agency; therefore, this current version may not represent the final version of the deliverable.

    HIVE-MIND SPACE: A META-DESIGN APPROACH FOR CULTIVATING AND SUPPORTING COLLABORATIVE DESIGN.

    The ever-growing complexity of design projects requires more knowledge than any individual can have and, therefore, needs the active engagement of all stakeholders in the design process. Collaborative design exploits synergies from multidisciplinary communities, encourages divergent thinking, and enhances social creativity. The research documented in this thesis supports and deepens the understanding of collaborative design in two dimensions: (1) it developed and evaluated socio-technical systems to support collaborative design projects; and (2) it defined and explored a meta-design framework focused on how these systems enable users, as active contributors, to modify and further develop them. The research is grounded in, and simultaneously extends, the following major dimensions of meta-design: (1) it exploits the contributions of social media and Web 2.0 as innovative information technologies; (2) it facilitates the shift from consumer cultures to cultures of participation; (3) it fosters social creativity by harnessing contributions that occur in cultures of participation; (4) it empowers end users to be active designers involved in creating situated solutions. In a world where change is the norm, meta-design is a necessity rather than a luxury, because it is impossible to design software systems at design time for problems that occur only at use time. The co-evolution of systems and users' social practices pursued in this thesis requires a software environment that can evolve and be tailored continuously. End-user development explores tools and methods to support end users who tailor software artifacts. However, it addresses this objective primarily from a technical perspective and focuses mainly on tailorability. This thesis, centered on meta-design, extends end-user development by creating social conditions and design processes for broad participation in design activities both at design time and at use time.
It builds on previous research into meta-design that has provided a strategic overview of design opportunities and principles, and it addresses some shortcomings of meta-design, such as the lack of guidelines for building concrete meta-design environments that can be assessed by empirical evaluation. Given the goal of this research, to explore meta-design approaches for cultivating and supporting collaborative design, the overarching research question guiding this work is: How do we provide a socio-technical environment to bring multidisciplinary design communities together to foster creativity, collaboration, and design evolution? To answer this question, my research was carried out through four different phases: (1) synthesizing concepts, models, and theories; (2) framing conceptual models; (3) developing several systems in specific application areas; and (4) conducting empirical evaluation studies. The main contributions of this research are:
    - The Hive-Mind Space model, a meta-design framework derived from the "software shaping workshop" methodology and integrating the "seeding, evolutionary growth, reseeding" model. The bottom-up approach inherent in this framework breaks down static social structures so as to support richer ecologies of participation, and provides the means for structuring communication and appropriation. The model's open mediation mechanism tackles unanticipated communication gaps among different design communities.
    - MikiWiki, a structured programmable wiki I developed to demonstrate how the Hive-Mind Space model can be implemented as a practical platform that benefits users, and how its features and values can be specified so as to be empirically observable and assessable.
    - Empirical insights, such as those from applying MikiWiki to different collaborative design studies, providing evidence that the different phases of meta-design represent different modes rather than discrete levels.