
    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was training in geospatial data acquisition and processing for students attending Architecture and Engineering courses, in order to start up a team of "volunteer mappers". The project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, registered in the World Heritage List since 1997. The area was affected by a flood on the 25th of October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, topographic and GNSS instruments, etc., or by non-conventional systems and instruments such as UAVs, mobile mapping, etc. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection

    Real-time processing of social media with SENTINEL: a syndromic surveillance system incorporating deep learning for health classification

    Interest in real-time syndromic surveillance based on social media data has greatly increased in recent years. The ability to detect disease outbreaks earlier than traditional methods allow would be highly useful for public health officials. This paper describes a software system built upon recent developments in machine learning and data processing to achieve this goal. The system is built from reusable modules, integrated into data processing pipelines that are easily deployable and configurable. It applies deep learning to the problem of classifying health-related tweets and is able to do so with high accuracy. It can detect illness outbreaks from Twitter data and then build up and display information about these outbreaks, including relevant news articles, to provide situational awareness. It also provides nowcasting of current disease levels, estimated from previous clinical data combined with Twitter data. The preliminary results are promising, with the system able to detect outbreaks of influenza-like illness symptoms that could then be confirmed by existing official sources. The nowcasting module shows that using social media data can improve prediction for multiple diseases over using traditional data sources alone
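    The two core steps described above — classifying tweets as health-related and flagging anomalous daily counts — can be sketched as follows. This is a minimal illustrative sketch, not the paper's system: the keyword lexicon stands in for the deep learning classifier, and `detect_outbreak` is a simple z-score anomaly test, both hypothetical.

```python
from statistics import mean, stdev

# Hypothetical symptom lexicon; SENTINEL uses a trained deep
# learning classifier rather than keyword matching.
FLU_TERMS = {"fever", "cough", "flu", "chills", "sore throat"}

def is_health_related(tweet: str) -> bool:
    """Crude stand-in for the tweet health classifier."""
    text = tweet.lower()
    return any(term in text for term in FLU_TERMS)

def detect_outbreak(daily_counts: list[int], threshold: float = 2.0) -> bool:
    """Flag the latest daily count if it exceeds the mean of the
    preceding days by more than `threshold` standard deviations."""
    history, today = daily_counts[:-1], daily_counts[-1]
    if len(history) < 2:
        return False
    return today > mean(history) + threshold * stdev(history)

tweets = ["Down with the flu and a fever", "Lovely weather today"]
counts = [3, 4, 2, 3, 15]  # toy daily counts of flagged tweets
print([is_health_related(t) for t in tweets])  # [True, False]
print(detect_outbreak(counts))                 # True
```

    In a real pipeline the flagged counts would be combined with clinical data for nowcasting; here the anomaly test alone illustrates the outbreak-detection idea.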

    Shuttle Propulsion System Major Events and the Final 22 Flights

    Numerous lessons have been documented from the Space Shuttle propulsion elements. Major events include the loss of the Solid Rocket Boosters (SRBs) on STS-4 and the shutdown of a Space Shuttle Main Engine (SSME) during ascent on STS-51F. On STS-112 only half the pyrotechnics fired during release of the vehicle from the launch pad, a testament to redundancy. STS-91 exhibited freezing of a main combustion chamber pressure measurement, and on STS-93 nozzle tube ruptures necessitated a low liquid oxygen level cutoff of the main engines. A number of on-pad aborts were experienced during the early program, resulting in delays. The two accidents, STS-51L and STS-107, had unique heritage in early program decisions and vehicle configuration. Following STS-51L, significant resources were invested in developing a fundamental physical understanding of solid rocket motor environments and material system behavior. Following STS-107, the risk of ascent debris was better characterized and controlled. Situational awareness during all mission phases improved, and the management team instituted effective risk assessment practices. The last 22 flights of the Space Shuttle, following the Columbia accident, were characterized by remarkable improvement in safety and reliability. Numerous problems were solved in addition to reduction of the ascent debris hazard. The Shuttle system, though not as operable as envisioned in the 1970s, successfully assembled the International Space Station (ISS). By the end of the program, the Space Shuttle propulsion system achieved very high performance, was largely reusable, exhibited high reliability, and provided heavy-lift Earth-to-orbit propulsion. During the program a number of project management and engineering processes were implemented and improved. Technical performance, schedule accountability, cost control, and risk management were effectively implemented. Award fee contracting was used to provide performance incentives. The Certification of Flight Readiness and Mission Management processes became very effective. A key to the success of the propulsion element projects was the relationship between the MSFC project office and support organizations and their counterpart contractor organizations. The teams worked diligently to understand and satisfy requirements and achieve mission success

    Situation Management with Complex Event Processing

    With the broader dissemination of digital technologies, visionary concepts like the Internet of Things also affect an increasing number of use cases with interfaces to humans, e.g. manufacturing environments with technical operators monitoring the processes. This leads to additional challenges, as besides the technical issues, human aspects also have to be considered for a successful implementation of strategic initiatives like Industrie 4.0. From a technical perspective, complex event processing has proven itself in practice to be capable of integrating and analyzing huge amounts of heterogeneous data and establishing a basic level of situation awareness by detecting situations of interest. Whereas this reactive nature of complex event processing systems may be sufficient for machine-to-machine use cases, the new characteristic of application fields with humans remaining in the control loop leads to an increasing action distance and delayed reactions. Taking human aspects into consideration leads to new requirements, with transparency and comprehensibility of the processing of events being the most important ones. Improving the comprehensibility of complex event processing and extending its capabilities towards effective support of human operators allows tackling technical and non-technical challenges at the same time. The main contribution of this thesis answers the question of how to evolve state-of-the-art complex event processing from its reactive nature towards a transparent and holistic situation management system. The goal is to improve the interaction among systems and humans in use cases with interfaces between both worlds. Realizing holistic situation management requires three missing capabilities to be introduced by the contributions of this thesis. First, based on the achieved transparency, the retrospective analysis of situations is enabled by collecting information related to a situation's occurrence and development. To this end, CEP engine-specific situation descriptions are transformed into a common model, allowing the automatic decomposition of the underlying patterns to derive partial patterns describing the intermediate states of processing. Second, by introducing the psychological model of situation awareness into complex event processing, human aspects of information processing are taken into consideration and introduced into the complex event processing paradigm. Based on this model, an extended situation life-cycle and a transition method are derived. The introduced concepts and methods allow the implementation of the controlling function of situation management and enable the effective acquisition and maintenance of situation awareness for human operators to purposefully direct their attention towards upcoming situations. Finally, completing the set of capabilities for situation management, an approach is presented to support the generation and integration of prediction models for predictive situation management. To this end, methods are introduced to automatically label and extract relevant data for the generation of prediction models and to enable the embedding of the resulting models for automatic evaluation and execution. The contributions are introduced, applied and evaluated along a scenario from the manufacturing domain
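    The idea of decomposing a situation pattern into partial patterns that expose intermediate matching states can be sketched with a toy detector. Everything here is hypothetical and illustrative — the event kinds, thresholds, and class names are invented, and a real CEP engine would compile such patterns rather than hand-code them.

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str
    value: float

class SituationDetector:
    """Toy CEP pattern: OVERHEAT := (temp > 80) followed by (pressure > 5).

    The `state` attribute exposes the intermediate (partial) match,
    mirroring the decomposition of a pattern into partial patterns
    so a human operator can see how far a situation has developed."""

    def __init__(self):
        self.state = "IDLE"

    def on_event(self, e: Event) -> bool:
        if self.state == "IDLE" and e.kind == "temp" and e.value > 80:
            self.state = "TEMP_HIGH"        # partial pattern matched
        elif self.state == "TEMP_HIGH" and e.kind == "pressure" and e.value > 5:
            self.state = "IDLE"
            return True                      # full situation detected
        return False

stream = [Event("temp", 85.0), Event("pressure", 6.2)]
det = SituationDetector()
print([det.on_event(e) for e in stream])  # [False, True]
```

    Exposing `state` between events is what gives the operator transparency: the partial match "TEMP_HIGH" can direct attention before the full situation occurs.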

    CHORUS Deliverable 2.2: Second report - identification of multi-disciplinary key issues for gap analysis toward EU multimedia search engines roadmap

    After addressing the state of the art during the first year of Chorus and establishing the existing landscape in multimedia search engines, we identified and analyzed gaps within the European research effort during our second year. In this period we focused on three directions, notably technological issues, user-centred issues and use cases, and socio-economic and legal aspects. These were assessed by two central studies: firstly, a concerted vision of the functional breakdown of a generic multimedia search engine, and secondly, representative use-case descriptions with a related discussion on requirements for technological challenges. Both studies were carried out in cooperation and consultation with the community at large through EC concertation meetings (multimedia search engines cluster), several meetings with our Think-Tank, presentations at international conferences, and surveys addressed to EU project coordinators as well as national initiative coordinators. Based on the feedback obtained, we identified two types of gaps, namely core technological gaps that involve research challenges, and “enablers”, which are not necessarily technical research challenges but have an impact on innovation progress. New socio-economic trends are presented as well as emerging legal challenges

    Big Data Analysis

    The value of big data is predicated on the ability to detect trends and patterns, and more generally to make sense of large volumes of data that are often a heterogeneous mix of formats, structures, and semantics. Big data analysis is the component of the big data value chain that focuses on transforming raw acquired data into a coherent, usable resource suitable for analysis. Drawing on a range of interviews with key stakeholders in small and large companies and in academia, this chapter outlines key insights, the state of the art, emerging trends, future requirements, and sectoral case studies for data analysis
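    The transformation of raw, heterogeneous data into a coherent resource can be illustrated with a minimal normalization sketch. The record formats, field names, and target schema below are invented for illustration; they are not from the chapter.

```python
import json

# Hypothetical heterogeneous inputs: the same kind of entity
# arriving in three different formats and conventions.
raw_records = [
    '{"name": "Ada", "age": "36"}',   # JSON string, age typed as text
    {"Name": "Grace", "Age": 45},     # dict with different key casing
    "Edsger,72",                      # CSV-style line
]

def normalize(record) -> dict:
    """Map each source format onto one coherent schema:
    {"name": str, "age": int}."""
    if isinstance(record, str) and record.lstrip().startswith("{"):
        record = json.loads(record)
    if isinstance(record, dict):
        lowered = {k.lower(): v for k, v in record.items()}
        return {"name": lowered["name"], "age": int(lowered["age"])}
    name, age = record.split(",")
    return {"name": name, "age": int(age)}

clean = [normalize(r) for r in raw_records]
print(clean)
```

    Reconciling format, structure, and type mismatches like these is the unglamorous core of the analysis stage; only after it can trend and pattern detection operate on the data as one resource.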