
    Alert threshold assessment based on equivalent displacements for the identification of potentially critical landslide events

    Over the past years, the growing number of natural hazards worldwide has led to an increasing focus on activities aimed at studying and controlling the occurrence of these phenomena. In this context, monitoring systems have become a fundamental component of Landslide Early Warning Systems, making it possible to follow the evolution of these processes and assess the need for dedicated mitigation measures. This result is achieved thanks to several technological advancements that have led to the introduction of more accurate and reliable sensors, as well as automatic procedures for data acquisition and processing. Despite these improvements, however, data interpretation is still a challenging task, in particular when it comes to the identification of critical events and failure forecasting. This paper presents a methodology developed to assess whether a potentially critical event displays a significant deviation from previously sampled data, or whether it can be classified as a false alarm. The process relies on the definition of a threshold value based on the landslide behavior preceding the event of interest. In particular, the reference value derives from the evaluation of equivalent displacements, defined as the displacements previously observed in a time interval equal to the one shown by the potentially critical event. The paper reports a series of examples from different case studies, involving both false alarms and real collapses, underlining the effectiveness of the proposed model as a useful tool to evaluate landslide behavior in near real time.
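    The core of the method lends itself to a compact sketch: compare the displacement accumulated during the candidate event against the "equivalent displacements" observed in earlier windows of the same duration. The Python sketch below is a minimal illustration assuming a cumulative displacement series sampled at known times; the percentile-based threshold and all names are illustrative stand-ins, not the paper's exact definition.

```python
import numpy as np

def check_event(times, cum_disp, event_start, event_end, percentile=95.0):
    """Compare the displacement accumulated during a potentially critical
    event against 'equivalent displacements': displacements observed over
    earlier windows of the same duration. The percentile threshold is an
    illustrative assumption, not the paper's threshold definition."""
    times = np.asarray(times, dtype=float)
    cum_disp = np.asarray(cum_disp, dtype=float)
    duration = event_end - event_start

    # displacement accumulated during the candidate event
    d_event = np.interp(event_end, times, cum_disp) - np.interp(event_start, times, cum_disp)

    # equivalent displacements: same-length windows fully contained in the prior history
    equivalents = []
    for t0 in times[times + duration <= event_start]:
        d = np.interp(t0 + duration, times, cum_disp) - np.interp(t0, times, cum_disp)
        equivalents.append(d)

    threshold = np.percentile(equivalents, percentile)
    return d_event, threshold, d_event > threshold
```

    If the event displacement exceeds the threshold derived from the equivalent displacements, the event is flagged as potentially critical; otherwise it is treated as consistent with the previously observed behavior.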

    Walking the bridge from single- to multi-species approaches in Southern African fisheries management

    Fisheries management worldwide is in flux, with calls for an EAF (Ecosystem Approach to Fisheries) needing to be balanced with the ongoing requirement to provide timeous and realistic assessment-based advice for management (often with major economic and social consequences), which is typically based on single-species stock assessment models. This thesis is an attempt to walk the bridge from single- to multi-species approaches to fisheries management by developing a "traditional" single-species stock assessment model that is used for management purposes, assessing possibilities for extending the model to incorporate multi-species effects, and evaluating the potential of a range of multi-species approaches to contribute to the furnishment of practical management advice. The South African abalone Haliotis midae fishery is an example of a commercially valuable resource that is currently experiencing a downturn due to a complicated mix of biological, social, political, economic and environmental factors. Core problems include illegal fishing and recent ecosystem change in the form of a movement of rock lobsters Jasus lalandii into a major part of the range of the abalone. It seems that the lobsters have dramatically reduced sea urchin Parechinus angulosus populations, thereby indirectly negatively impacting juvenile abalone, which rely on the urchins for shelter. A spatial and age-structured production model (ASPM) developed as part of this study has provided the basis for management advice for this resource over recent years by projecting abundance trends under alternative future catch levels. The focus is on the main abalone fishery Zones A-D. The model estimates the extent of the reduction in juvenile abalone survival due to the ecosystem change and estimates the illegal take using a novel fisheries index: the confiscations per unit of policing effort (CPUPE). As a consequence of the recent explosion of poaching activity, the combined Zones A-D model-predicted 2003 poaching estimate of 933 MT (corresponding to the assumption that, on average, 36% of all poached abalone are confiscated) is more than seven times the legal 2003 commercial TAC for these Zones. Given the complexity of ecosystem processes, there is a need to critically evaluate the tools used to steer this thinking. The focus here is on both the most widely employed multi-species/ecosystem approach (ECOPATH with ECOSIM, or EwE) and a scenario in which there is an urgent need (from management) for scientific evaluations to quantify indirect interactions between marine mammals and fisheries. A critical review of EwE highlights some weaknesses related to, for example, the handling of some life history responses such as compensatory changes in natural mortality rates of marine mammals, overcompensatory stock-recruit relationships, inadequate representation of uncertainty, possible problems in extrapolating from the micro-scale to the macro-scale, as well as some (not too far-reaching) mathematical inconsistencies in the underlying equations. Strengths include the structured parameterisation framework, the inclusion of a well-balanced level of conceptual realism, a novel representation of predator-prey interaction terms and the inclusion of a Bayes-like approach (ECORANGER) to take account of the uncertainty associated with values for model inputs.
The potential of EwE to contribute to five important multi-species management quandaries in the marine environments off southern Africa and Antarctica is assessed, leading to the conclusion that EwE has limited predictive capability in these contexts. Aspects of the potential application of other multi-species/ecosystem modelling approaches to advise the management of South African fisheries are discussed. In general, reliable predictive ability from such models is likely to be achieved sooner for top predators because relatively fewer links need to be modelled. Accordingly, discussion concentrates on the problems of modelling marine mammal-fisheries interactions. Competition is a primary concern, but existing evidence is inconclusive because of the difficulties of substantiating claims that predation by marine mammals is adversely affecting a fishery or vice versa. Numerous species have been implicated in such conflicts, and long-term studies are essential to evaluate relationships between rates of predation and the types and densities of available prey, i.e., functional responses. More realistic modelling studies are needed to address operational or management issues. Such models should reflect uncertainty in data and model structure, describe the influence of model assumptions, focus on systems where there is the greatest chance of success, incorporate a sufficient array of ecological links, and include appropriate spatial and temporal scaling for data collection and modelling exercises. In general, GADGET (Globally Applicable Area-Disaggregated Generic Ecosystem Evaluation Tool) and Minimum Realistic Models (MRM) are seen to show the most promise for use as tools to assess indirect effects between marine mammals and fisheries. The hake-seal-fishery interactions off the west coast of southern Africa are discussed as an example, and the initiatives being pursued to further this modelling work are summarized. An important message derived from this study concerns the need to couple multi-species/ecosystem models with a simulation framework to take explicit account of uncertainty and management issues.
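    As a rough illustration of the kind of projection an age-structured production model performs when evaluating alternative future catch levels, the sketch below advances a toy age-structured population under a fixed annual catch using Pope's approximation. It is not the thesis's ASPM (which is spatial, fitted to data and estimates poaching via CPUPE); the constant recruitment, single plus-group and all names are simplifying assumptions for illustration only.

```python
import numpy as np

def project_biomass(numbers_at_age, weight_at_age, natural_mortality,
                    annual_catch, recruitment, n_years):
    """Toy age-structured projection under a fixed annual catch (Pope's
    approximation): survive half a year, remove the catch, survive the rest,
    then age the population. Constant recruitment and no spatial structure
    are simplifying assumptions; the thesis model is far richer."""
    N = np.array(numbers_at_age, dtype=float)
    w = np.asarray(weight_at_age, dtype=float)
    trajectory = []
    for _ in range(n_years):
        mid = N * np.exp(-0.5 * natural_mortality)   # survive to mid-year
        biomass = float(np.sum(mid * w))
        trajectory.append(biomass)
        harvest = min(annual_catch / biomass, 0.99)  # fraction removed by the fishery
        survivors = mid * (1.0 - harvest) * np.exp(-0.5 * natural_mortality)
        N[1:] = survivors[:-1]                       # age the population
        N[-1] += survivors[-1]                       # plus-group accumulates
        N[0] = recruitment                           # constant recruitment (illustrative)
    return trajectory

# usage: project biomass for ten years under one fixed annual catch level
print(project_biomass([1000, 800, 600, 400], [0.1, 0.3, 0.5, 0.7],
                      natural_mortality=0.2, annual_catch=150,
                      recruitment=1000, n_years=10))
```

    Running such a projection under each candidate catch level and comparing the resulting abundance trajectories is, in spirit, how assessment-based advice on alternative TACs is generated.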

    Data Mining-based Fragmentation of XML Data Warehouses

    With the multiplication of XML data sources, many XML data warehouse models have been proposed to handle data heterogeneity and complexity in a way relational data warehouses fail to achieve. However, XML-native database systems currently suffer from limited performance, both in terms of manageable data volume and response time. Fragmentation helps address both of these issues. Derived horizontal fragmentation is typically used in relational data warehouses and can definitely be adapted to the XML context. However, the number of fragments produced by classical algorithms is difficult to control. In this paper, we propose a k-means-based fragmentation approach that allows the number of fragments to be controlled through its k parameter. We experimentally compare its efficiency to classical derived horizontal fragmentation algorithms adapted to XML data warehouses and show its superiority.
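    A minimal sketch of the idea, assuming facts are encoded as vectors of the workload predicates they satisfy (an assumption for illustration, not necessarily the paper's exact representation): cluster the vectors with k-means and let each cluster define one fragment, so the number of fragments is fixed directly by k.

```python
import numpy as np
from sklearn.cluster import KMeans

def fragment_facts(fact_vectors, k, random_state=0):
    """Derived horizontal fragmentation driven by k-means: facts assigned to
    the same cluster end up in the same fragment, so the fragment count is
    controlled by k rather than emerging from predicate combinations."""
    labels = KMeans(n_clusters=k, n_init=10,
                    random_state=random_state).fit_predict(fact_vectors)
    return {c: np.flatnonzero(labels == c) for c in range(k)}

# usage: a 0/1 matrix of facts x workload predicates, split into 3 fragments
rng = np.random.default_rng(42)
facts = rng.integers(0, 2, size=(100, 5))
fragments = fragment_facts(facts, k=3)
print({c: len(ids) for c, ids in fragments.items()})
```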

    Applications of Polyhedral Computations to the Analysis and Verification of Hardware and Software Systems

    Convex polyhedra are the basis for several abstractions used in static analysis and computer-aided verification of complex and sometimes mission-critical systems. For such applications, the identification of an appropriate complexity-precision trade-off is a particularly acute problem, so that the availability of a wide spectrum of alternative solutions is mandatory. We survey the range of applications of polyhedral computations in this area; give an overview of the different classes of polyhedra that may be adopted; outline the main polyhedral operations required by automatic analyzers and verifiers; and look at some possible combinations of polyhedra with other numerical abstractions that have the potential to improve the precision of the analysis. Areas where further theoretical investigations can result in important contributions are highlighted. Comment: 51 pages, 11 figures.
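    As one concrete example of a polyhedral operation an analyzer relies on, the sketch below tests whether a polyhedron given by linear constraints A x <= b entails a further constraint a.x <= c, by maximising a.x with a linear program. This is only an illustration using floating-point LP; production libraries such as the Parma Polyhedra Library use exact arithmetic and a richer set of operations (intersection, convex hull, widening).

```python
import numpy as np
from scipy.optimize import linprog

def entails(A, b, a, c):
    """Does the polyhedron {x : A x <= b} entail a.x <= c?  Maximise a.x over
    the polyhedron: if the maximum stays below c, the constraint is entailed.
    Floating-point LP with a small tolerance, for illustration only."""
    A = np.asarray(A, dtype=float)
    res = linprog(-np.asarray(a, dtype=float), A_ub=A, b_ub=b,
                  bounds=[(None, None)] * A.shape[1])
    if not res.success:
        # status 2 = empty polyhedron (entails everything); 3 = unbounded direction
        return res.status == 2
    return -res.fun <= c + 1e-9

# usage: the unit square entails x + y <= 2 but not x + y <= 1.5
A = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)  # 0 <= x, y <= 1
b = np.array([1, 0, 1, 0], dtype=float)
print(entails(A, b, a=[1, 1], c=2))    # True
print(entails(A, b, a=[1, 1], c=1.5))  # False
```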

    ECLAP 2012 Conference on Information Technologies for Performing Arts, Media Access and Entertainment

    There is a long history of Information Technology innovation within the Cultural Heritage area. The performing arts have also benefited from a number of new innovations that unveil a range of synergies and possibilities. Most of the technologies and innovations produced for digital libraries, media entertainment and education can be exploited in the field of performing arts, with adaptation and repurposing. Performing arts offer many interesting challenges and opportunities for research, innovation and the exploitation of cutting-edge research results from interdisciplinary areas. For these reasons, ECLAP 2012 can be regarded as a continuation of past conferences such as AXMEDIS and WEDELMUSIC (both published by IEEE and FUP). ECLAP is a European Commission project to create a social network and media access service for performing arts institutions in Europe and to create the e-library of performing arts, exploiting innovative ICT solutions.

    A Survey on Compiler Autotuning using Machine Learning

    Since the mid-1990s, researchers have been trying to use machine-learning-based approaches to solve a number of different compiler optimization problems. These techniques primarily enhance the quality of the obtained results and, more importantly, make it feasible to tackle two main compiler optimization problems: optimization selection (choosing which optimizations to apply) and phase-ordering (choosing the order in which to apply them). The compiler optimization space continues to grow due to the advancement of applications, the increasing number of compiler optimizations, and new target architectures. Generic optimization passes in compilers cannot fully leverage newly introduced optimizations and, therefore, cannot keep up with the pace of increasing options. This survey summarizes and classifies the recent advances in using machine learning for compiler optimization, particularly on the two major problems of (1) selecting the best optimizations and (2) the phase-ordering of optimizations. The survey highlights the approaches taken so far, the obtained results, the fine-grained classification among different approaches and, finally, the influential papers of the field. Comment: version 5.0 (updated September 2018); preprint version of the survey accepted at ACM CSUR 2018 (42 pages). History: Received November 2016; Revised August 2017; Revised February 2018; Accepted March 2018.
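    For a feel of the baseline that the surveyed machine-learning techniques aim to beat, the sketch below performs naive iterative compilation: it exhaustively benchmarks subsets of a few GCC flags and keeps the fastest. The flag list, file handling and timing strategy are illustrative assumptions; the surveyed approaches replace this exhaustive search with learned models that predict good flag selections or phase orders from program features.

```python
import itertools, os, subprocess, tempfile, time

FLAGS = ["-funroll-loops", "-ftree-vectorize", "-finline-functions"]  # illustrative subset

def run_time(source_file, flag_subset, runs=3):
    """Compile source_file with GCC under one flag subset and return the best
    wall-clock time of the resulting binary over a few runs."""
    with tempfile.TemporaryDirectory() as tmp:
        exe = os.path.join(tmp, "a.out")
        subprocess.run(["gcc", "-O2", *flag_subset, source_file, "-o", exe], check=True)
        best = float("inf")
        for _ in range(runs):
            start = time.perf_counter()
            subprocess.run([exe], check=True)
            best = min(best, time.perf_counter() - start)
        return best

def best_flags(source_file):
    """Exhaustive search over all subsets of FLAGS: the brute-force baseline
    that learned optimization-selection models try to approximate cheaply."""
    subsets = [list(s) for r in range(len(FLAGS) + 1)
               for s in itertools.combinations(FLAGS, r)]
    return min(subsets, key=lambda s: run_time(source_file, s))
```

    Even with three flags the search already needs eight compile-and-run cycles; with the hundreds of options in a real compiler, the exponential blow-up is exactly why learned predictors are attractive.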

    Training of Crisis Mappers and Map Production from Multi-sensor Data: Vernazza Case Study (Cinque Terre National Park, Italy)

    The aim of this paper is to present the development of a multidisciplinary project carried out in cooperation between Politecnico di Torino and ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action). The goal of the project was to train students attending Architecture and Engineering courses in geospatial data acquisition and processing, in order to start up a team of "volunteer mappers". The project aims to document the environmental and built heritage subject to disaster; the purpose is to improve the capabilities of the actors involved in geospatial data collection, integration and sharing. The proposed area for testing the training activities is the Cinque Terre National Park, inscribed on the World Heritage List since 1997. The area was affected by a flood on 25 October 2011. In line with other international experiences, the group is expected to be active after emergencies in order to update maps, using data acquired by typical geomatic methods and techniques such as terrestrial and aerial LiDAR, close-range and aerial photogrammetry, topographic and GNSS instruments, or by non-conventional systems and instruments such as UAVs and mobile mapping. The ultimate goal is to implement a WebGIS platform to share all the collected data with local authorities and the Civil Protection.