
    Weak nodes detection in urban transport systems: Planning for resilience in Singapore

    The availability of massive data-sets describing human mobility offers the possibility to design simulation tools to monitor and improve the resilience of transport systems in response to traumatic events such as natural and man-made disasters (e.g. floods, terrorist attacks). In this perspective, we propose ACHILLES, an application to model people's movements in a given transport system mode through a multiplex network representation based on mobility data. ACHILLES is a web-based application which provides an easy-to-use interface to explore the mobility fluxes and the connectivity of every urban zone in a city, as well as to visualize changes in the transport system resulting from the addition or removal of transport modes, urban zones, and single stops. Notably, our application allows the user to assess the overall resilience of the transport network by identifying its weakest node, i.e. the Urban Achilles Heel, with reference to ancient Greek mythology. To demonstrate the impact of ACHILLES for humanitarian aid, we consider its application to a real-world scenario by exploring human mobility in Singapore in response to flood prevention.
    Comment: 9 pages, 6 figures, IEEE Data Science and Advanced Analytics
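    As a rough illustration of the weakest-node idea described above, the sketch below ranks the nodes of a toy transport graph by the drop in global efficiency their removal causes; the graph, node names, and efficiency-based impact measure are illustrative assumptions, not the ACHILLES implementation.

```python
# Minimal sketch: rank nodes of a transport graph by the connectivity loss
# caused by their removal (an illustrative stand-in for the "Achilles heel").
import networkx as nx

def weakest_node(graph: nx.Graph):
    """Return the node whose removal reduces global efficiency the most."""
    baseline = nx.global_efficiency(graph)
    worst_node, worst_drop = None, -1.0
    for node in graph.nodes():
        reduced = graph.copy()
        reduced.remove_node(node)
        drop = baseline - nx.global_efficiency(reduced)
        if drop > worst_drop:
            worst_node, worst_drop = node, drop
    return worst_node, worst_drop

# Toy single-mode network of urban zones (hypothetical data, not Singapore's).
G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])
print(weakest_node(G))
```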

    Leveraging Domain Knowledge for Inclusive and Bias-aware Humanitarian Response Entry Classification

    Accurate and rapid situation analysis during humanitarian crises is critical to delivering humanitarian aid efficiently and is fundamental to humanitarian imperatives and the Leave No One Behind (LNOB) principle. This data analysis can benefit greatly from language processing systems, e.g., by classifying the text data according to a humanitarian ontology. However, approaching this by simply fine-tuning a generic large language model (LLM) involves considerable practical and ethical issues, particularly the lack of effectiveness on data-sparse and complex subdomains, and the encoding of societal biases and unwanted associations. In this work, we aim to provide an effective and ethically-aware system for humanitarian data analysis. We approach this by (1) introducing a novel architecture adjusted to the humanitarian analysis framework, (2) creating and releasing a novel humanitarian-specific LLM called HumBert, and (3) proposing a systematic way to measure and mitigate biases. Our experimental results show that our approach outperforms strong baseline models in both zero-shot and full-training settings, while also revealing the existence of biases in the resulting LLMs. Utilizing a targeted counterfactual data augmentation approach, we significantly reduce these biases without compromising performance.
    Comment: Accepted at IJCAI 202
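    The targeted counterfactual data augmentation mentioned above can be sketched generically: for each training text mentioning a monitored identity term, add a copy with the term swapped for its counterpart, so the classifier sees both variants. The term pairs and example below are hypothetical placeholders, not the paper's actual lists or data.

```python
# Sketch of targeted counterfactual data augmentation for bias mitigation.
import re

# Hypothetical counterfactual pairs; the paper's actual term lists differ.
SWAP_PAIRS = [("refugees", "residents"), ("women", "men")]

def counterfactual_augment(examples):
    """examples: list of (text, label) pairs. Returns the augmented list."""
    augmented = list(examples)
    for text, label in examples:
        for a, b in SWAP_PAIRS:
            for src, dst in ((a, b), (b, a)):
                if re.search(rf"\b{src}\b", text, flags=re.IGNORECASE):
                    swapped = re.sub(rf"\b{src}\b", dst, text, flags=re.IGNORECASE)
                    augmented.append((swapped, label))
    return augmented

data = [("Displaced women need shelter and food assistance.", "shelter")]
print(counterfactual_augment(data))
```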

    No exit: next steps to help promote South Pacific peace and prosperity

    This paper explores contemporary official and scholarly thinking on aid, development, conflict prevention and strategic shaping, to identify promising avenues for promoting regional growth and stability in a tight budget environment.
    Overview: As Australia focuses on its global interests in a changing and challenging international environment, there's a danger that we'll lose sight of important constants of history and geography. We don't have an either/or choice between near and distant security imperatives. While the Australian Government's decision to lift defence funding will help with this, cutting aid to help offset that boost may prove counterproductive. We also need to further improve the quality of our aid and regional diplomacy, as well as the hard and soft aspects of our security engagement. This paper suggests some useful first steps for doing so.

    Information propagation in social networks during crises: A structural framework

    In crisis situations like riots, earthquakes, and storms, information plays a central role in organizing interventions and decision making. Due to their increasing use during crises, social media (SM) represents a valuable source of information that could help obtain a full picture of people's needs and concerns. In this chapter, we highlight the importance of SM networks in crisis management (CM) and show how information is propagated through them. The chapter also summarizes the current state of research related to information propagation in SM networks during crises. In particular, three categories of information propagation research are identified: network analysis and community detection, role- and topic-oriented information propagation, and infrastructure-oriented information propagation. The chapter describes an analysis framework that deals with structural information propagation for crisis management purposes. Structural propagation is about broadcasting specific information obtained from social media networks to targeted sinks/receivers/hubs such as emergency agencies, police departments, fire departments, etc. Specifically, the framework aims to identify the discussion topics, known as sub-events, related to a crisis (event) from SM content. A brief description of the techniques used to detect topics, and of the way those topics can be used in structural information propagation, is presented.
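    As a rough illustration of the topic (sub-event) detection step mentioned above, the sketch below clusters a handful of hypothetical crisis posts into topics with a bag-of-words model and Latent Dirichlet Allocation; the messages, parameters, and choice of scikit-learn are assumptions for illustration, not the chapter's specific technique.

```python
# Sketch: extract discussion topics ("sub-events") from crisis-related posts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

posts = [  # hypothetical social media messages about a flood event
    "road closed near the bridge, water still rising",
    "shelter open at the community centre for flood victims",
    "power outage reported across the east district",
    "volunteers needed at the shelter tonight",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(posts)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(doc_term)

terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top_terms = [terms[i] for i in topic.argsort()[-4:][::-1]]
    print(f"topic {idx}: {', '.join(top_terms)}")
```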

    Development of a national-scale real-time Twitter data mining pipeline for social geodata on the potential impacts of flooding on communities

    Social media, particularly Twitter, is increasingly used to improve resilience during extreme weather events and emergency management situations, including floods: by communicating potential risks and their impacts, and informing agencies and responders. In this paper, we developed a prototype national-scale Twitter data mining pipeline for improved stakeholder situational awareness during flooding events across Great Britain, by retrieving relevant social geodata grounded in environmental data sources (flood warnings and river levels). With potential users, we identified and addressed three research questions to develop this application, whose components constitute a modular architecture for real-time dashboards: first, polling national flood warning and river level Web data sources to obtain at-risk locations; secondly, real-time retrieval of geotagged tweets proximate to at-risk areas; thirdly, filtering flood-relevant tweets with natural language processing and machine learning libraries, using word embeddings of tweets. We demonstrated the national-scale social geodata pipeline using over 420,000 georeferenced tweets obtained between 20 and 29 June 2016.
    Highlights:
    • Prototype real-time social geodata pipeline for flood events and demonstration dataset
    • National-scale flood warnings/river levels set 'at-risk areas' in Twitter API queries
    • Monitoring multiple locations (without keywords) retrieved current, geotagged tweets
    • Novel application of word embeddings in flooding context identified relevant tweets
    • Pipeline extracts tweets to visualise using open-source libraries (SciKit Learn/Gensim)
    Keywords: flood management; Twitter; volunteered geographic information; natural language processing; word embeddings; social geodata.
    Hardware required: Intel i3 or mid-performance PC with multicore processor and SSD main drive; 8 GB memory recommended.
    Software required: Python and library dependencies specified in Appendix A1.2.1, (viii) environment.yml
    Software availability: All source code can be found at GitHub public repositories
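    The word-embedding filtering step lends itself to a compact illustration. The sketch below averages Gensim word vectors per tweet and trains a scikit-learn classifier to separate flood-relevant from irrelevant messages; it echoes the libraries named above, but the tweets, labels, and model sizes are toy assumptions rather than the paper's pipeline.

```python
# Sketch: filter flood-relevant tweets with averaged word embeddings.
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

tweets = [  # toy labelled tweets: 1 = flood-relevant, 0 = irrelevant
    ("river burst its banks, street flooded", 1),
    ("heavy rain, basement flooding again", 1),
    ("great match at the stadium tonight", 0),
    ("new cafe opened on the high street", 0),
]
tokenised = [text.split() for text, _ in tweets]
labels = [label for _, label in tweets]

# Tiny embedding model for illustration; a real pipeline would train on a large corpus.
w2v = Word2Vec(tokenised, vector_size=32, min_count=1, workers=1, seed=0)

def tweet_vector(tokens):
    """Average the word vectors of a tokenised tweet (zeros if none are known)."""
    vectors = [w2v.wv[w] for w in tokens if w in w2v.wv]
    return np.mean(vectors, axis=0) if vectors else np.zeros(w2v.vector_size)

X = np.vstack([tweet_vector(t) for t in tokenised])
clf = LogisticRegression().fit(X, labels)
print(clf.predict([tweet_vector("flood water rising near the river".split())]))
```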