Information spreading during emergencies and anomalous events
The most critical time for information to spread is in the aftermath of a
serious emergency, crisis, or disaster. Individuals affected by such situations
can now turn to an array of communication channels, from mobile phone calls and
text messages to social media posts, when alerting social ties. These channels
drastically improve the speed at which information spreads during a
time-sensitive event, and provide lasting records of human dynamics during
and after the event.
Retrospective analysis of such anomalous events provides researchers with a
class of "found experiments" that may be used to better understand social
spreading. In this chapter, we study information spreading due to a number of
emergency events, including the Boston Marathon Bombing and a plane crash at a
western European airport. We also contrast the different information which may
be gleaned by social media data compared with mobile phone data and we estimate
the rate of anomalous events in a mobile phone dataset using a proposed anomaly
detection method.
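The chapter's own anomaly detection method is not specified in this abstract, but the idea of flagging anomalous events in mobile phone data can be illustrated with a generic robust z-score (median/MAD) over hourly call volume. The data and threshold below are illustrative assumptions, not the chapter's proposed method.

```python
# Generic sketch: flag anomalous hours in a call-volume series using a
# robust z-score based on the median and the median absolute deviation
# (MAD). This is an illustration only, not the chapter's detection method.
import statistics

# Hypothetical hourly call counts; the spike at index 5 mimics an emergency.
hourly_calls = [310, 295, 320, 305, 298, 910, 315, 302]

med = statistics.median(hourly_calls)
mad = statistics.median(abs(c - med) for c in hourly_calls)

# 0.6745 scales the MAD to match the standard deviation for normal data.
robust_z = [0.6745 * (c - med) / mad for c in hourly_calls]
anomalies = [i for i, z in enumerate(robust_z) if abs(z) > 3.5]
print(anomalies)  # -> [5]
```

A median/MAD score is used here rather than a plain mean/standard-deviation z-score because a single large spike inflates the standard deviation and can mask itself.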
Disasters: Issues for State and Federal Government Finances
Extreme events like hurricanes, earthquakes, or terrorist attacks present major challenges for fiscal systems at all levels of government. Analysts concerned with the fiscal and financial impacts of disasters must attempt to assess the likelihood of rare events of large magnitude such as Hurricane Katrina. Extreme value theory, applied here to flood damage data for Louisiana, offers one promising methodology for this purpose. The experience of Katrina and 9/11 also shows that large disasters have large intergovernmental impacts. Individual states could, in principle, engage in more extensive ex ante financial and policy preparations for disasters, including disaster avoidance, but the "revealed institutional structure" exposed by recent experience shows that the US federal system shifts much of the economic incidence of local disasters to the rest of society through intergovernmental transfers. This raises policy questions regarding the assignment of responsibility for disaster avoidance in the US federation. In particular, Federal "ownership" of the consequences of disasters may invite or necessitate new forms of Federal "control" of subnational government.
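The extreme value theory approach mentioned above can be sketched as fitting a generalized extreme value (GEV) distribution to annual-maximum damage data and reading off return levels. The data below are synthetic and illustrative, not the Louisiana flood damage series used in the paper.

```python
# Hedged sketch: fit a generalized extreme value (GEV) distribution to
# hypothetical annual-maximum flood damage data and estimate return
# levels, in the spirit of extreme value theory. Synthetic data only.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual maxima of flood damage (in $ millions), illustrative only.
annual_max_damage = rng.gumbel(loc=50.0, scale=20.0, size=60)

# Fit the three GEV parameters (shape, location, scale) by maximum likelihood.
shape, loc, scale = genextreme.fit(annual_max_damage)

def return_level(T):
    """Damage level exceeded with probability 1/T in any given year."""
    return genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

print(f"10-year return level:  {return_level(10):.1f}")
print(f"100-year return level: {return_level(100):.1f}")
```

The 100-year return level exceeds the 10-year one by construction, which is what makes this framing useful for sizing reserves against rare, large-magnitude events.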
$1.00 per RT #BostonMarathon #PrayForBoston: analyzing fake content on Twitter
Online social media has emerged as one of the prominent channels for the dissemination of information during real-world events. Malicious content posted online during events can result in damage, chaos, and monetary losses in the real world. We analyzed one such medium, Twitter, for content generated during the Boston Marathon blasts, which occurred on April 15th, 2013. A large amount of fake content and many malicious profiles originated on the Twitter network during this event. The aim of this work is to perform an in-depth characterization of the factors that influenced malicious content and profiles becoming viral. Our results showed that 29% of the most viral content on Twitter during the Boston crisis was rumors and fake content, 51% was generic opinions and comments, and the rest was true information. We found that a large number of users with high social reputation and verified accounts were responsible for spreading the fake content. Next, we used a regression prediction model to verify that the overall impact of all users who propagate fake content at a given time can be used to estimate the growth of that content in the future. Many malicious accounts created on Twitter during the Boston event were later suspended by Twitter. We identified over six thousand such user profiles and observed that the creation of such profiles surged considerably right after the blasts occurred. We identified closed community structure and star formation in the interaction network of these suspended profiles amongst themselves.
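The regression idea described in this abstract, predicting a rumor's future spread from the aggregate impact of the users who have propagated it so far, can be sketched with ordinary least squares. The feature choice (total follower count, log-scaled) and the synthetic data are illustrative assumptions, not the paper's exact model.

```python
# Hedged sketch: predict future retweet counts of a rumor from the
# aggregate follower count of users who shared it early on, via ordinary
# least squares. Data and feature choice are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_rumors = 200
# Total follower count of users who shared each rumor in its first hour.
impact = rng.uniform(1e3, 1e6, size=n_rumors)
# Synthetic ground truth: future retweets grow with log-impact, plus noise.
future_retweets = 120.0 * np.log10(impact) + rng.normal(0, 30, size=n_rumors)

# Fit slope and intercept on the log-scaled feature.
X = np.column_stack([np.log10(impact), np.ones(n_rumors)])
coef, *_ = np.linalg.lstsq(X, future_retweets, rcond=None)

def predict(total_followers):
    return coef[0] * np.log10(total_followers) + coef[1]

print(f"slope={coef[0]:.1f}, intercept={coef[1]:.1f}")
print(f"predicted retweets at 1e5 followers: {predict(1e5):.0f}")
```

The fitted slope recovers a positive relationship: rumors propagated by higher-reach users are predicted to spread further, which is the qualitative claim the paper's regression is used to verify.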
Food calamities and governance; an inventory of approaches
In normal circumstances, a governance structure of the food system has evolved that serves the system so as to reduce transaction costs. While its overarching conditions are often set by government policy toward the sector, the private sector, with the help of an enabling government, has developed arrangements to its own liking. The question addressed in this review is whether this governance structure of the food system is robust enough to cover extreme events, calamities that strike unexpectedly and may harm large sections of the system. Do normal arrangements cover part of what should be done in these circumstances, or do they perhaps hinder the application of adequate governance fit for such extreme events?
Engineering Crowdsourced Stream Processing Systems
A crowdsourced stream processing system (CSP) is a system that incorporates
crowdsourced tasks in the processing of a data stream. This can be seen as
enabling crowdsourcing work to be applied on a sample of large-scale data at
high speed, or equivalently, enabling stream processing to employ human
intelligence. It also leads to a substantial expansion of the capabilities of
data processing systems. Engineering a CSP system requires the combination of
human and machine computation elements. From a general systems theory
perspective, this means taking into account inherited as well as emerging
properties from both these elements. In this paper, we position CSP systems
within a broader taxonomy, outline a series of design principles and evaluation
metrics, present an extensible framework for their design, and describe several
design patterns. We showcase the capabilities of CSP systems by performing a
case study that applies our proposed framework to the design and analysis of a
real system (AIDR) that classifies social media messages during time-critical
crisis events. Results show that compared to a pure stream processing system,
AIDR can achieve a higher data classification accuracy, while compared to a
pure crowdsourcing solution, the system makes better use of human workers by
requiring much less manual effort.
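The hybrid human-machine routing at the heart of a CSP system like AIDR can be sketched as a confidence-gated pipeline: the machine classifier labels stream items it is confident about, and low-confidence items are escalated to crowd workers. The names, threshold, and toy classifier below are illustrative assumptions, not AIDR's actual implementation.

```python
# Hedged sketch of confidence-gated routing in a crowdsourced stream
# processing (CSP) system: confident machine labels pass through; the
# rest are queued for human annotation. Illustrative names/thresholds.
from collections import deque

CONFIDENCE_THRESHOLD = 0.8  # assumed cut-off for trusting the machine label

def classify(message):
    """Toy stand-in for a trained classifier: returns (label, confidence)."""
    if "collapsed" in message or "injured" in message:
        return "infrastructure_damage", 0.9
    return "other", 0.5

crowd_queue = deque()   # items awaiting human (crowd) annotation
auto_labeled = []       # items the machine handled on its own

def process(message):
    label, confidence = classify(message)
    if confidence >= CONFIDENCE_THRESHOLD:
        auto_labeled.append((message, label))
    else:
        crowd_queue.append(message)  # escalate to crowd workers

for msg in ["bridge collapsed downtown",
            "thoughts and prayers",
            "two injured near station"]:
    process(msg)

print(f"auto-labeled: {len(auto_labeled)}, sent to crowd: {len(crowd_queue)}")
# -> auto-labeled: 2, sent to crowd: 1
```

This gating is what lets such a system use human workers sparingly: the crowd only sees the fraction of the stream the machine cannot confidently classify, and in a full system the resulting crowd labels would feed back into retraining the classifier.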
- …