
    A TWITTER-INTEGRATED WEB SYSTEM TO AGGREGATE AND PROCESS EMERGENCY-RELATED DATA

    A major challenge when responding to time-sensitive, information-critical emergencies is to source raw volunteered data from on-site public sources and extract information that can enhance awareness of the emergency itself from a geographical context. This research explores the use of Twitter in the emergency domain by developing a Twitter-integrated web system capable of aggregating and processing emergency-related tweet data. The objectives of the project are to collect volunteered tweet data on emergencies from public citizen sources via the Twitter API, process the data based on geo-location information and syntax into organized informational entities relevant to an emergency, and subsequently deliver the information on a map-like interface. The web system framework is targeted at organizations that seek to transform volunteered emergency-related data available on the Twitter platform into timely, useful emergency alerts that can enhance situational awareness, and is intended to be accessible to the public through a user-friendly web interface. Rapid Application Development (RAD) was the methodology of choice for project development. The developed system achieved a System Usability Scale score of 84.25, tabulated from a usability survey of 20 respondents. The system is best suited to emergencies where the transmission of timely, quantitative data is of paramount importance, and provides a useful framework for extracting and displaying emergency alerts with a geographical perspective based on volunteered citizen tweets. It is hoped that the project can ultimately contribute to the existing domain of knowledge on social media-assisted emergency applications.
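    The aggregation and geo-processing pipeline described above can be illustrated with a minimal sketch. This is not the authors' implementation: the Twitter API v2 recent-search endpoint, the query string, the selected fields, and the BEARER_TOKEN environment variable are all assumptions made here for illustration.

```python
# Minimal sketch (not the system described in the abstract): pull recent
# emergency-related, geo-tagged tweets and reduce them to map-ready "alert"
# records. Query syntax, field selection and required access level are assumptions.
import os
import requests

SEARCH_URL = "https://api.twitter.com/2/tweets/search/recent"

def fetch_emergency_tweets(query="(flood OR earthquake OR fire) has:geo", max_results=50):
    """Request recent geo-tagged tweets matching emergency keywords."""
    headers = {"Authorization": f"Bearer {os.environ['BEARER_TOKEN']}"}  # assumed env var
    params = {
        "query": query,
        "max_results": max_results,
        "tweet.fields": "created_at,geo,text",
        "expansions": "geo.place_id",
        "place.fields": "full_name,geo",
    }
    resp = requests.get(SEARCH_URL, headers=headers, params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

def to_alerts(payload):
    """Join tweets with their place objects into simple alert dicts for a map layer."""
    places = {p["id"]: p for p in payload.get("includes", {}).get("places", [])}
    alerts = []
    for tweet in payload.get("data", []):
        place = places.get(tweet.get("geo", {}).get("place_id"))
        alerts.append({
            "time": tweet["created_at"],
            "text": tweet["text"],
            "location": place["full_name"] if place else None,
            "bbox": (place.get("geo") or {}).get("bbox") if place else None,  # [west, south, east, north]
        })
    return alerts

if __name__ == "__main__":
    print(to_alerts(fetch_emergency_tweets())[:5])
```

    Each alert dict could then be rendered as a marker or bounding box on the map-like interface the abstract describes; the exact entity schema is an assumption.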

    Master of Science

    The 2012 Great Utah ShakeOut highlighted the necessity for increased coordination in the collection and sharing of spatial data related to disaster response during an event. Multiple agencies must quickly relay scientific and damage observations between teams in the field and command centers. Spatial Data Infrastructure (SDI) is a framework that directly supports information discovery, access, and use of data in decision-making processes. An SDI contains five core components: the policies, access networks, data handling facilities, standards, and human resources needed for the effective collection, management, access, delivery, and utilization of spatial data for a specific area. Implementation of an SDI will increase communication between agencies, field-based reconnaissance teams, first responders, and individuals in the event of a disaster. The increasing popularity of location-based mobile social networks has led to spatial data from these sources being used in the context of managing disaster response and recovery. Spatial data acquired from social networks, or Volunteered Geographic Information (VGI), could potentially contribute thousands of low-cost observations to aid damage assessment and recovery efforts that may otherwise go unreported. The objective of this research is to design and develop an SDI that incorporates VGI, professional Geographic Information System (GIS) layers, a mobile application, and scientific reports to aid the disaster management process. A secondary goal is to assess the utility of the resulting SDI. The end result of combining the three systems (i.e., SDE, a mobile application, and VGI), along with the network of relevant users, is an SDI that improves the volume, quality, currency, accuracy, and accessibility of vital spatial and scientific information following a hazard event.
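    As an illustration of how a single volunteered observation could enter such an SDI as a standards-based layer item, the following sketch encodes a hypothetical damage report as a GeoJSON Feature (RFC 7946). The class name, attribute fields, and example values are assumptions for illustration, not the thesis's actual data model.

```python
# Illustrative sketch only: a volunteered damage observation expressed as a
# GeoJSON Feature so it can sit in an SDI layer alongside professional GIS data.
import json
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class VGIObservation:
    lon: float
    lat: float
    severity: int          # assumed scale, e.g. 1 (minor) .. 5 (severe)
    description: str
    reporter: str

    def to_geojson_feature(self) -> dict:
        """Encode the observation as a GeoJSON Feature (RFC 7946)."""
        return {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [self.lon, self.lat]},
            "properties": {
                "severity": self.severity,
                "description": self.description,
                "reporter": self.reporter,
                "reported_at": datetime.now(timezone.utc).isoformat(),
            },
        }

# Hypothetical report near downtown Salt Lake City
obs = VGIObservation(-111.89, 40.76, 3, "Cracked masonry facade near 400 South", "citizen_042")
print(json.dumps({"type": "FeatureCollection", "features": [obs.to_geojson_feature()]}, indent=2))
```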

    1st year EFAST annual report

    The present report provides information about the activities conducted during the first year of the EFAST project. The first chapter describes the inquiries conducted at the beginning of the project and briefly summarises the main results. The second chapter is dedicated to the first EFAST workshop, where some of the leading scientists in the field of earthquake engineering met to discuss the needs and technologies related to earthquake engineering. The third chapter presents the state of the art and future directions in seismic testing and simulation. The final chapter describes the preliminary design of the web portal of the future testing facility. JRC.DG.G.5 - European Laboratory for Structural Assessment

    Identifying success factors in crowdsourced geographic information use in government

    Crowdsourcing geographic information in government focuses on projects that engage people who are not government officials or employees in collecting, editing and sharing information with governmental bodies. This type of project emerged in the past decade due to technological and societal changes, such as the increased use of smartphones combined with citizens' growing levels of education and technical ability to use them. Such projects also flourished due to the need for updated data in relatively short time frames when financial resources are low. They range from recording the experience of feeling an earthquake to recording the location of businesses during the summer. Fifty cases in which crowdsourced geographic information was used by governmental bodies across the world are analysed. About 60% of the cases were examined in both 2014 and 2017, to allow for comparison and the identification of success and failure. The analysis looked at different aspects and their relationship to success: the drivers to start a project; scope and aims; stakeholders and relationships; inputs into the project; technical and organisational aspects; and problems encountered. The main key factors of the case studies were analysed with Qualitative Comparative Analysis (QCA), an analytical method that combines quantitative and qualitative tools in sociological research. From the analysis we conclude that there is no "magic bullet" or perfect methodology for a successful crowdsourcing-in-government project. Unless the organisation has reached maturity in the area of crowdsourcing, identifying a champion and starting a project that does not address authoritative datasets directly is a good way to ensure early success and start the process of organisational learning on how to run such projects. The importance of governmental support and trust is undisputed. If the choice is to use new technologies, this should be accompanied by an investment of appropriate resources within the organisation to ensure that the investment bears fruit. Alternatively, using an existing technology that was successful elsewhere and investing in training and capacity building is another path to success. We also identified the importance of intermediary Non-Governmental Organizations (NGOs) with experience and knowledge of working with crowdsourcing within a partnership. These organizations have the knowledge and skills to implement projects at the boundary between government and the crowd, and can therefore offer the experience needed to ensure better implementation. Changes to and improvement of public services, or a focus on environmental monitoring, can be a good basis for a project. Capturing base mapping is a good starting point, too. The recommendations of the report address organisational issues, resources, and legal aspects.
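    The truth-table construction at the heart of crisp-set QCA can be sketched with toy data. The condition names (champion, ngo_partner, existing_tech) and the case values below are invented for illustration and do not reproduce the report's actual conditions or findings.

```python
# Toy illustration of the crisp-set QCA truth-table step: group cases by their
# combination of conditions and compute, for each combination, the share of
# successful cases (consistency) and how many cases cover it.
import pandas as pd

cases = pd.DataFrame([
    # champion, ngo_partner, existing_tech, success (1 = present / successful)
    {"champion": 1, "ngo_partner": 1, "existing_tech": 1, "success": 1},
    {"champion": 1, "ngo_partner": 0, "existing_tech": 1, "success": 1},
    {"champion": 0, "ngo_partner": 1, "existing_tech": 0, "success": 0},
    {"champion": 1, "ngo_partner": 1, "existing_tech": 0, "success": 1},
    {"champion": 0, "ngo_partner": 0, "existing_tech": 1, "success": 0},
])

conditions = ["champion", "ngo_partner", "existing_tech"]

truth_table = (
    cases.groupby(conditions)["success"]
    .agg(consistency="mean", n_cases="size")
    .reset_index()
)
print(truth_table)
```

    In a full QCA the truth table would then be minimised to a set of sufficient condition combinations; only the tabulation step is shown here.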

    DiSECCS - final summary report. Work packages, 1 - 4

    Seismic techniques comprise the key geophysical toolset for imaging and characterising induced changes in the subsurface associated with human activity. This ability to observe and quantify changes in fluid saturation, pressure, and geological stress and strain using active and passive seismic techniques has critical application to the monitoring of geological CO2 storage. The DiSECCS project (Diagnostic Seismic Toolbox for Efficient Control of CO2 Storage) has developed seismic monitoring tools and methodologies to identify and characterise injection-induced changes, whether of fluid saturation or pressure, in storage reservoirs. We have developed guidelines for the monitoring systems and protocols required to maintain the integrity of storage reservoirs suitable for large-scale CO2 storage. The focus is on storage in saline aquifers (comprising the largest potential global storage resource), where considerable amounts of in situ water have to be displaced and both pressure and two-phase flow effects have consequences for storage integrity and storage capacity. Underground storage of CO2 is associated with significant levels of public concern. A better understanding of this concern is a key element of establishing monitoring protocols that instil wider public confidence in CO2 storage. DiSECCS draws on analogue activities, such as 'fracking' for shale gas, in conjunction with a discursive process involving lay participants, to gain insights into how people engage with similar underground activities and how controversies surrounding particular projects develop and evolve.
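    The time-lapse ("4D") differencing principle behind this kind of seismic monitoring can be sketched with synthetic traces: a repeat (monitor) survey is compared against a baseline survey so that injection-induced amplitude changes stand out. The synthetic data, window length, and attribute choice below are illustrative assumptions, not DiSECCS tools.

```python
# Schematic example only: difference a synthetic monitor trace against a baseline
# trace and flag the window with the strongest time-lapse (4D) amplitude anomaly.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 500                              # samples per trace (assumed)
baseline = rng.normal(0, 1, n_samples)
monitor = baseline.copy()
monitor[200:260] += 0.8 * np.hanning(60)     # synthetic injection-induced response

difference = monitor - baseline              # 4D difference trace

def windowed_rms(trace, win=50):
    """RMS amplitude in sliding windows, a simple attribute for flagging change."""
    trace = np.asarray(trace, dtype=float)
    n_win = len(trace) // win
    return np.sqrt((trace[: n_win * win].reshape(n_win, win) ** 2).mean(axis=1))

anomaly = windowed_rms(difference)
print("window with the strongest 4D anomaly:", int(anomaly.argmax()))
```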