    Geospatial crowdsourced data fitness analysis for spatial data infrastructure based disaster management actions

    The reporting of disasters has shifted from official media reports to citizen reporters at the disaster scene. This kind of crowd-based reporting, related to disasters or any other events, is often identified as 'Crowdsourced Data' (CSD). CSD is freely and widely available thanks to current technological advancements. The quality of CSD is often problematic, as it is created by citizens of varying skills and backgrounds. CSD is generally unstructured, and its quality remains poorly defined. Moreover, the location availability of CSD and the quality of any available locations may be incomplete. Traditional data quality assessment methods and parameters are often incompatible with the unstructured nature of CSD due to its undocumented nature and missing metadata. Although other research has identified credibility and relevance as possible CSD quality assessment indicators, the available assessment methods for these indicators are still immature. In the 2011 Australian floods, citizens and disaster management administrators used the Ushahidi Crowdmap platform and the Twitter social media platform to communicate flood-related information extensively, including hazards, evacuations, help services, road closures and property damage. This research designed a CSD quality assessment framework and tested the quality of the 2011 Australian floods' Ushahidi Crowdmap and Twitter data. In particular, it explored location availability and location quality assessment, semantic extraction of hidden location toponyms, and the analysis of the credibility and relevance of reports. The research was conducted using a Design Science (DS) research method, which is often utilised in Information Science (IS) research. The location availability assessment compared the locations in the Ushahidi Crowdmap and Twitter data against three datasets: Google Maps, OpenStreetMap (OSM) and the Queensland Department of Natural Resources and Mines' (QDNRM) road data. Missing locations were semantically extracted using Natural Language Processing (NLP) and gazetteer lookup techniques. The credibility of the Ushahidi Crowdmap dataset was assessed using a naive Bayesian Network (BN) model commonly utilised in spam email detection. CSD relevance was assessed by adapting Geographic Information Retrieval (GIR) relevance assessment techniques, which are also utilised in the IT sector. Thematic and geographic relevance were assessed using a Term Frequency–Inverse Document Frequency Vector Space Model (TF-IDF VSM) and NLP based on semantic gazetteers. Results of the CSD location comparison showed that the combined use of non-authoritative and authoritative data improved location determination. The semantic location analysis indicated some improvement in the location availability of the tweets and Crowdmap data; however, the quality of the new locations was still uncertain. The credibility analysis revealed that spam email detection approaches are feasible for CSD credibility detection; however, it was critical to train the model in a controlled environment using structured training, including modified training samples. The use of GIR techniques for CSD relevance analysis provided promising results. A separate relevance-ranked list of the same CSD data was prepared through manual analysis; the two lists generally agreed, indicating the system's potential to analyse relevance in a similar way to humans. This research showed that CSD fitness analysis can potentially improve the accuracy, reliability and currency of CSD and may be utilised to fill information gaps in authoritative sources. The integrated and autonomous CSD qualification framework presented provides a guide for flood disaster first responders and could be adapted to support other forms of emergencies.
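    The TF-IDF vector space model used for thematic relevance can be sketched briefly. The snippet below is an illustrative Python sketch, not the thesis's implementation: each term is weighted by its frequency within a report and its rarity across the collection, and reports are then ranked against a query by cosine similarity. The report texts and query are invented examples.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build sparse TF-IDF vectors (dicts) for tokenised documents."""
    n = len(docs)
    # document frequency: how many documents contain each term
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (tf[t] / len(doc)) * math.log(n / df[t])
                        for t in tf})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

reports = [
    "flood water rising near bridge road closed".split(),
    "school fundraiser cancelled due to weather".split(),
    "evacuation centre open flood victims bridge".split(),
]
query = "flood bridge closed".split()

vecs = tfidf_vectors(reports + [query])
query_vec = vecs[-1]
# rank reports by similarity to the query, most relevant first
ranked = sorted(range(len(reports)),
                key=lambda i: cosine(vecs[i], query_vec),
                reverse=True)
```

    Report 0 shares three query terms and ranks first; report 1 shares none and ranks last, mirroring how a relevance-ranked list of CSD reports would be produced.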

    Cairn and landscape interaction: a study of human ecodynamics in the North Atlantic

    Cairns are very versatile constructions, used for a variety of human activities. For this reason, this master's thesis focuses on the analysis of navigational and agricultural cairns as a way to further the study and understanding of human ecodynamics in the North Atlantic. The project was developed within the wider framework of DataARC, a cyberinfrastructure that interconnects several North Atlantic datasets while facilitating access to them; the thesis aims to connect its results and datasets to the cyberinfrastructure's concept map. The study uses a variety of interdisciplinary data, methodologies, tools, and approaches, and is divided into two case studies. Case Study One analyses the role of navigational cairns and their relationship with roads and the general landscape while creating proposed routes between late medieval/early modern farm networks in Iceland; the methods used are least cost path and intervisibility analysis. The results support the hypothesis that cairns were used as road markers and suggest a dependence between landscape features and cairn positioning, indicating landscape agency. Case Study Two focuses on the position and effect of clearance cairns in the Scottish landscape. Archaeological and landscape data (i.e. soil quality, land compaction, erosion levels, and continued use in agricultural practices) were combined in a newly created dataset that was examined with statistical analysis. The results show a correlation between the presence of cairns and a high erosion risk, and propose three effects of clearance cairns on soils: soils so exhausted they are no longer cultivable, soils that are cultivable but at risk of erosion, and soils exhausted so little that they can still be cultivated. Overall, the project's findings highlight the usefulness of cairns in the study of human ecodynamics. In both case studies, cairns are the physical representation of an active and ever-changing interaction between humans and their environment, and of the relevance and agency of landscape: Case Study One shows feature-dependent cairn positioning, and Case Study Two shows human adaptation to changes in the landscape. This opens the possibility of expanding the area of study to other regions and continents, and/or of developing studies that focus on cairn multifunctionality.
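    The least cost path method of Case Study One can be illustrated with a minimal sketch. Assuming a cost raster in which higher cell values stand in for harder-to-cross terrain (the grid and costs below are invented, not the Icelandic data), Dijkstra's algorithm recovers the cheapest route between two cells:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a cost raster: entering a cell costs
    its raster value; movement is 4-connected."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # walk back from goal to start to recover the route
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# toy "terrain": 9s stand in for steep or impassable ground
terrain = [
    [1, 1, 9, 1],
    [9, 1, 9, 1],
    [9, 1, 1, 1],
    [9, 9, 9, 1],
]
path, total = least_cost_path(terrain, (0, 0), (3, 3))
```

    The route threads through the low-cost corridor, analogous to a proposed route between farms avoiding difficult landscape; in practice a GIS cost surface would encode slope, wetness and visibility rather than toy values.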

    A Framework for Understanding and Analysing eBusiness Models


    Social Knowledge Environments

    Knowledge management represents a key issue for both information systems academics and practitioners, including those who have become disillusioned by actual results that fail to deliver on exaggerated promises and idealistic visions. Social software, a tremendous global success story, has prompted similarly high expectations regarding the ways in which organizations can improve their knowledge handling. But can these expectations be met, whether in academic research or the real world? This article seeks to identify current research trends and gaps, with a focus on social knowledge environments. The proposed research agenda features four focal challenges: semi-permeable organizations, social software in professional work settings, crowd knowledge, and cross-border knowledge management. Three solutions emerge as likely methods to address these challenges: design-oriented solutions, analytical solutions, and interdisciplinary dialogue.

    The biology of sexual development of Plasmodium: the design and implementation of transmission-blocking strategies

    A meeting to discuss the latest developments in the biology of sexual development of Plasmodium and transmission control was held April 5-6, 2011, in Bethesda, MD. The meeting was sponsored by the Bill & Melinda Gates Foundation and the National Institutes of Health, National Institute of Allergy and Infectious Diseases (NIH/NIAID) in response to the challenge issued at the Malaria Forum in October 2007 that the malaria community should re-engage with the objective of global eradication. The consequent rebalancing of research priorities has brought to the forefront of the research agenda the essential need to reduce parasite transmission. A key component of any transmission reduction strategy must be methods to attack the parasite as it passes from man to mosquito (and vice versa). Such methods must be rationally based on a secure understanding of transmission at the molecular, cellular, population and evolutionary levels. The meeting represented a first attempt to draw together scientists with expertise in these multiple layers of understanding to discuss the scientific foundations and resources that will be required to ensure secure progress toward the design and successful implementation of effective interventions.

    Graphical Database Architecture For Clinical Trials

    The general area of the research is Health Informatics. The research focuses on creating an innovative and novel solution to manage and analyze clinical trials data. It constructs a Graphical Database Architecture (GDA) for Clinical Trials (CT) using the Neo4j graph database as a robust, scalable and high-performance store. The purpose of the research project is to develop concepts and techniques based on this architecture to accelerate clinical data navigation at lower cost. The research design uses a positivist approach to empirical research. The research is significant because it proposes a new approach to clinical trials data through graph theory and designs a responsive structure for clinical data that can be deployed across the health informatics landscape. It uniquely contributes to the scholarly literature on Not only SQL (NoSQL) graph databases, mainly Neo4j in CT, for future research in clinical informatics. A prototype was created and examined to validate the concepts, taking advantage of Neo4j's high availability, scalability, and powerful graph query language (Cypher). The study finds that integrating search methodologies and information retrieval with the graphical database provides a solid starting point to manage, query, and analyze clinical trials data; furthermore, the design and development of a prototype demonstrate the study's conceptual model. Likewise, the proposed Clinical Trials Ontology (CTO) incorporates all data elements of a standard clinical study, facilitating a heuristic overview of the treatments, interventions, and outcome results of these studies.
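    As a rough illustration of the property-graph data model behind such an architecture (the node labels, relationship types and example records below are hypothetical, not the study's actual CTO schema), a trial, its interventions and an outcome can be stored as labelled nodes joined by typed edges. In Cypher, the one-hop traversal at the end would read roughly MATCH (t:Trial {id: "t1"})-[:HAS_INTERVENTION]->(i) RETURN i.name; the sketch mimics that in plain Python:

```python
# Minimal in-memory property graph: labelled nodes with properties,
# plus typed directed edges -- the shape a Neo4j GDA would persist.
nodes = {}
edges = []  # (source_id, relationship_type, target_id)

def add_node(node_id, label, **props):
    nodes[node_id] = {"label": label, **props}

def add_edge(src, rel, dst):
    edges.append((src, rel, dst))

def neighbours(node_id, rel):
    """Follow outgoing edges of one relationship type: the one-hop
    traversal underlying a Cypher MATCH pattern."""
    return [nodes[dst] for s, r, dst in edges if s == node_id and r == rel]

# hypothetical schema: Trial -> Intervention -> Outcome
add_node("t1", "Trial", title="Drug A vs placebo")
add_node("i1", "Intervention", name="Drug A 10mg")
add_node("i2", "Intervention", name="Placebo")
add_node("o1", "Outcome", measure="6-month survival")
add_edge("t1", "HAS_INTERVENTION", "i1")
add_edge("t1", "HAS_INTERVENTION", "i2")
add_edge("i1", "MEASURED_BY", "o1")

interventions = [n["name"] for n in neighbours("t1", "HAS_INTERVENTION")]
```

    Navigating from a trial to its interventions and outcomes is a constant-time edge walk rather than a relational join, which is the performance argument the architecture rests on.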