
    Crisis Communication Patterns in Social Media during Hurricane Sandy

    Hurricane Sandy was one of the deadliest and costliest hurricanes of the past few decades. Many states experienced significant power outages; however, many people used social media to communicate while having limited or no access to traditional information sources. In this study, we explored the evolution of various communication patterns using machine learning techniques and determined the user concerns that emerged over the course of Hurricane Sandy. The original data included ~52M tweets from ~13M users between October 14, 2012 and November 12, 2012. We ran a topic model on ~763K tweets from the 4,029 most frequent users, each of whom tweeted about Sandy at least 100 times. We identified 250 well-defined communication patterns based on perplexity. Conversations of the most frequent and relevant users indicate the evolution of numerous topics specific to the storm phases (warning, response, and recovery). People were also concerned about storm location and timing, media coverage, and the activities of political leaders and celebrities. We also present the relevant keywords that contributed to each particular pattern of user concerns. Such keywords would be particularly meaningful for targeted information spreading and effective crisis communication in similar major disasters. Each of these words can also be helpful for efficient hashtagging to reach target audiences as needed via social media. The pattern recognition approach of this study can be used to identify real-time user needs in future crises.
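    A minimal sketch of the kind of topic-count selection by perplexity that this abstract describes, assuming gensim's LDA and a hypothetical `tokenized_tweets` list of token lists (not the authors' code or data); the study itself settled on 250 topics over a far larger corpus:

```python
# Sketch: choose the number of LDA topics by perplexity.
# `tokenized_tweets` is a tiny placeholder; real use would score a held-out
# evaluation chunk rather than reusing the training corpus.
from gensim.corpora import Dictionary
from gensim.models import LdaModel

tokenized_tweets = [
    ["sandy", "power", "outage", "nj"],
    ["storm", "surge", "flooding", "nyc"],
    ["sandy", "evacuation", "shelter"],
    ["power", "restored", "recovery"],
]

dictionary = Dictionary(tokenized_tweets)
corpus = [dictionary.doc2bow(doc) for doc in tokenized_tweets]

best = None
for k in (2, 5, 10):                                   # candidate topic counts
    lda = LdaModel(corpus, num_topics=k, id2word=dictionary,
                   passes=5, random_state=0)
    perplexity = 2 ** (-lda.log_perplexity(corpus))    # gensim returns a per-word log2 bound
    if best is None or perplexity < best[1]:
        best = (k, perplexity)

print("lowest-perplexity topic count:", best[0])
```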

    Explicit diversification of event aspects for temporal summarization

    During major events, such as emergencies and disasters, a large volume of information is reported on newswire and social media platforms. Temporal summarization (TS) approaches are used to automatically produce concise overviews of such events by extracting text snippets from related articles over time. Current TS approaches rely on a combination of event relevance and textual novelty for snippet selection. However, for events that span multiple days, textual novelty is often a poor criterion for selecting snippets, since many snippets are textually unique but semantically redundant or non-informative. In this article, we propose a framework for the diversification of snippets using explicit event aspects, building on recent work in search result diversification. In particular, we first propose two techniques to identify explicit aspects that a user might want to see covered in a summary for different types of events. We then extend a state-of-the-art explicit diversification framework to maximize the coverage of these aspects when selecting summary snippets for unseen events. Through experimentation over the TREC TS 2013, 2014, and 2015 datasets, we show that explicit diversification for temporal summarization significantly outperforms classical novelty-based diversification, as the use of explicit event aspects reduces the number of redundant and off-topic snippets returned, while also increasing summary timeliness.
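    The explicit diversification framework referred to here belongs to the family of xQuAD-style objectives. Below is a minimal sketch of greedy, aspect-aware snippet selection under that assumption; the relevance scores, aspect-coverage probabilities, and aspect weights are hypothetical placeholders rather than the paper's trained models:

```python
# Sketch of xQuAD-style explicit diversification for snippet selection.
from math import prod

def diversify(snippets, relevance, coverage, aspect_weights, k, lam=0.5):
    """Greedily pick k snippets, trading off relevance against aspect coverage.

    relevance[s]      -- event-relevance score of snippet s
    coverage[s][a]    -- probability that snippet s covers aspect a
    aspect_weights[a] -- importance of aspect a for this event type
    lam               -- weight on diversity vs. relevance
    """
    selected, candidates = [], list(snippets)
    while candidates and len(selected) < k:
        def score(s):
            # Reward covering aspects that the already-selected snippets miss.
            div = sum(
                aspect_weights[a] * coverage[s][a] *
                prod(1 - coverage[t][a] for t in selected)
                for a in aspect_weights
            )
            return (1 - lam) * relevance[s] + lam * div
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

# Toy usage with two aspects of a disaster event.
rel = {"s1": 0.9, "s2": 0.8, "s3": 0.4}
cov = {"s1": {"casualties": 0.9, "power": 0.1},
       "s2": {"casualties": 0.8, "power": 0.2},
       "s3": {"casualties": 0.1, "power": 0.9}}
weights = {"casualties": 0.6, "power": 0.4}
print(diversify(["s1", "s2", "s3"], rel, cov, weights, k=2))
```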

    A Bayesian-Based Approach for Public Sentiment Modeling

    Public sentiment is a direct, public-centric indicator of the success of action planning. Despite its importance, systematic modeling of public sentiment has remained largely untapped in previous studies. This research aims to develop a Bayesian-based approach for quantitative public sentiment modeling, which is capable of incorporating uncertainty and guiding the selection of public sentiment measures. This study comprises three steps: (1) quantifying prior sentiment information and new sentiment observations with a Dirichlet distribution and a multinomial distribution, respectively; (2) deriving the posterior distribution of sentiment probabilities by combining the Dirichlet and multinomial distributions via Bayesian inference; and (3) measuring public sentiment by aggregating sampled sets of sentiment probabilities with an application-based measure. A case study on Hurricane Harvey is provided to demonstrate the feasibility and applicability of the proposed approach. The developed approach also has the potential to be generalized to model various types of probability-based measures.
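    Steps (1) through (3) amount to a conjugate Dirichlet-multinomial update followed by Monte Carlo aggregation. A minimal sketch, assuming hypothetical prior and observed sentiment counts and a simple "net sentiment" aggregate in place of the paper's application-based measure:

```python
# Sketch: conjugate Dirichlet-multinomial posterior over sentiment probabilities.
import numpy as np

rng = np.random.default_rng(0)

alpha_prior = np.array([2.0, 2.0, 2.0])    # Dirichlet prior over (negative, neutral, positive)
observed    = np.array([120, 45, 85])      # new multinomial sentiment counts (e.g. labeled tweets)

# Conjugacy: Dirichlet(alpha) prior + multinomial counts -> Dirichlet(alpha + counts) posterior.
alpha_post = alpha_prior + observed
samples = rng.dirichlet(alpha_post, size=10_000)   # sampled sets of sentiment probabilities

# Aggregate the samples with an illustrative measure: net sentiment = P(positive) - P(negative).
net_sentiment = samples[:, 2] - samples[:, 0]
print("posterior mean net sentiment:", net_sentiment.mean())
print("95% credible interval:", np.percentile(net_sentiment, [2.5, 97.5]))
```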

    Quantifying human mobility resilience to extreme events using geo-located social media data
