29 research outputs found

    Shallow landslide susceptibility assessment in a data-poor region of Guatemala (Comitancillo Municipality)

    Although landslides are frequent natural phenomena in mountainous regions, the lack of data in emerging countries is a significant obstacle to assessing shallow landslide susceptibility. A key factor in risk-mitigation strategies is the evaluation of deterministic physical models for hazard assessment in these data-poor regions. Given the lack of physical information, the input parameters of these data-intensive deterministic models have to be estimated, which reduces the reliability of the assessment. To address this problem, we examined shallow landslide hazard in Comitancillo municipality, Guatemala. Shallow landslides are here defined as small (less than two or three metres deep) rotational or translational slides or earth flows. We based our hazard simulation on the stability index mapping model. The model's input parameters were estimated from a statistical analysis of factors affecting landslides in the municipality, obtained from a geodatabase. The model outputs were analysed and compared to an inventory of small-scale landslides. The comparison shows the effectiveness of the method developed to estimate input parameters for a deterministic model in regions where the physical data needed to assess shallow landslide susceptibility are lacking.
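    Stability index mapping rests on the infinite-slope factor of safety. The following is a minimal sketch of that computation; the parameter values used in the example calls are illustrative assumptions, not the study's calibrated inputs:

```python
import math

def factor_of_safety(slope_deg, cohesion, friction_deg, wetness,
                     density_ratio=0.5):
    """Infinite-slope factor of safety, the core of stability-index
    mapping (SINMAP-style) models:
        FS = [C + cos(theta) * (1 - w*r) * tan(phi)] / sin(theta)
    cohesion      : dimensionless combined root/soil cohesion C
    friction_deg  : soil internal friction angle phi, in degrees
    wetness       : relative wetness w in [0, 1]
    density_ratio : water-to-soil density ratio r
    """
    theta = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    return (cohesion
            + math.cos(theta) * (1.0 - wetness * density_ratio)
            * math.tan(phi)) / math.sin(theta)

# FS > 1 suggests stability; FS < 1 suggests susceptibility to failure.
gentle = factor_of_safety(30.0, 0.25, 35.0, 0.5)  # moderate slope: stable
steep = factor_of_safety(45.0, 0.25, 35.0, 0.5)   # steeper slope: unstable
```

    In a susceptibility map, this index is evaluated per grid cell, which is why reliable area-wide estimates of cohesion, friction angle and wetness matter so much in data-poor regions.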

    Distributed control in virtualized networks

    The increasing number of Internet-connected devices requires novel solutions to control next-generation network resources. The cooperation between Software Defined Networking (SDN) and Network Function Virtualization (NFV) appears to be a promising technology paradigm. The bottleneck of current SDN/NFV implementations is the use of a centralized controller. In this paper, we investigated different scenarios to identify the pros and cons of a distributed control plane. We implemented a prototype framework to benchmark centralized and distributed approaches. The test results were critically analysed, and the related considerations and recommendations are reported. The outcome of our research influenced the control-plane design of the following European R&D projects: PLATINO, FI-WARE and T-NOVA.

    DESIRE Simulation tool to demonstrate data products for security applications

    The Simulation Tool to demonstrate Data Products for Security Applications (DESIRE) is a simulation tool aimed at demonstrating the added value of including a thermal infrared (TIR) imager within different space-borne architecture options comprising different capabilities, i.e. SAR and optical. The simulator has been developed considering, as end users, system designers who need to assess the added value of infrared data when combined with other data. The DESIRE tool development has been based on mission scenarios that address the priority areas identified in the GMES services for security, e.g. border security, maritime surveillance and support to EU external action. Particularly relevant scenarios taken into account for the simulator user-requirements analysis have been oil spill detection, maritime ship surveillance, industrial site monitoring and urban heat islands. The simulator is composed of an external interface capable of ingesting different input products at different processing levels (from L0 to L2, depending on the data type), a processing chain for each data type to bring the products up to L3, a co-registration module, different data-combination and data-fusion techniques (to generate merged maps or maps with information extracted from different products), and a set of modules to customize and validate the data-fusion products depending on the scenario under investigation. DESIRE has been implemented as a flexible, configurable and modular simulation tool, to be used with existing and firmly planned in-orbit capabilities and to combine these with real or synthetic TIR data products. DESIRE is based on the simulation framework openSF. The modular design of DESIRE allows the future extension of the simulator with additional processing modules in order to deal with a wider range of scenarios and in-orbit architectures.
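    The chain-of-modules architecture described above (per-sensor chains up to L3, then co-registration and fusion) can be sketched as follows; the step, sensor and field names are illustrative placeholders, not DESIRE's actual module interfaces:

```python
from typing import Callable, Dict, List

Step = Callable[[dict], dict]

def make_chain(steps: List[Step]) -> Step:
    """Compose per-sensor processing steps into a single chain."""
    def chain(product: dict) -> dict:
        for step in steps:
            product = step(product)
        return product
    return chain

# Placeholder level-raising steps (real chains would do calibration,
# geocoding, feature extraction, ...).
def to_l1(p): return {**p, "level": "L1"}
def to_l2(p): return {**p, "level": "L2"}
def to_l3(p): return {**p, "level": "L3"}

CHAINS: Dict[str, Step] = {
    "TIR": make_chain([to_l1, to_l2, to_l3]),  # ingested at L0
    "SAR": make_chain([to_l2, to_l3]),         # ingested at L1
}

def coregister(products: List[dict]) -> List[dict]:
    """Resample every L3 product onto a common spatial grid."""
    return [{**p, "grid": "common"} for p in products]

def fuse(products: List[dict]) -> dict:
    """Merge co-registered products into a single fused map."""
    return {"level": "fused", "sources": sorted(p["sensor"] for p in products)}

inputs = [{"sensor": "TIR", "level": "L0"}, {"sensor": "SAR", "level": "L1"}]
l3_products = [CHAINS[p["sensor"]](p) for p in inputs]
fused_map = fuse(coregister(l3_products))
```

    The point of this shape is extensibility: adding a new sensor type or a new fusion technique means registering one more chain or stage, without touching the rest of the pipeline.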

    THE H2020 PROJECT REDSHIFT: OVERVIEW, FIRST RESULTS AND PERSPECTIVES

    The ReDSHIFT (Revolutionary Design of Spacecraft through Holistic Integration of Future Technologies) project has been approved by the European Community in the framework of the H2020 Protec 2015 call, focused on passive means to reduce the impact of space debris by prevention, mitigation and protection. In ReDSHIFT these goals will be achieved through a holistic approach that considers, from the outset, opposing and challenging constraints: the preservation of the space environment, the survivability of the spacecraft in the harsh space environment, and the safety of humans on the ground. The main innovative aspects of the project lie in the synergy between theoretical and experimental work, covering long-term simulations, astrodynamics, passive de-orbiting devices, 3D printing, design for demise, hypervelocity impact testing, and legal and normative issues. The paper presents a quick overview of the first ReDSHIFT results, highlighting the holistic approach of the project across the different aspects of space debris mitigation. Detailed reports on the results of the individual Work Packages can be found in other papers in this same volume.

    A distributed load balancing algorithm for the control plane in software defined networking

    The increasing demand for bandwidth, low latency and reliability, even in mobile scenarios, has pushed the evolution of networking technologies to satisfy the requirements of innovative services. In this context, Software Defined Networking (SDN), a networking paradigm that decouples the control plane from the forwarding plane, enables centralized network control and automated network management. To address the performance issues of the SDN control plane, this paper proposes a distributed load-balancing algorithm that dynamically balances control traffic across a cluster of SDN controllers, minimizing latency and increasing overall cluster throughput. The algorithm is based on game theory and converges to a specific equilibrium known as the Wardrop equilibrium. Numerical simulations show that the proposed algorithm outperforms a standard static configuration approach.
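    The defining property of a Wardrop equilibrium is that every controller carrying traffic experiences the same latency, so no flow can improve by switching. A minimal sketch of that idea follows; the linear latency model and the update rule are illustrative assumptions, not the paper's algorithm:

```python
# Toy Wardrop-style load balancer (illustrative, not the paper's code).
def balance(loads, latency_slopes, rate=0.1, iters=2000):
    """Iteratively shift control traffic from the slowest controller to
    the fastest until per-controller latencies equalize, the defining
    property of a Wardrop equilibrium.
    Latency model: latency_i = latency_slopes[i] * loads[i]."""
    loads = list(loads)
    for _ in range(iters):
        lat = [s * x for s, x in zip(latency_slopes, loads)]
        hi, lo = lat.index(max(lat)), lat.index(min(lat))
        shift = rate * (lat[hi] - lat[lo]) / latency_slopes[hi]
        loads[hi] -= shift  # total load is conserved
        loads[lo] += shift
    return loads

# Two controllers, the second twice as slow per unit load: at equilibrium
# the faster controller carries twice the traffic and latencies match.
balanced = balance([15.0, 15.0], [1.0, 2.0])
```

    A static configuration would keep the initial 50/50 split regardless of controller speed; the dynamic scheme converges to the split where no portion of control traffic would be better off on another controller.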

    A future internet interface to control programmable networks

    The current Internet infrastructure is still configured and managed manually, or with a limited level of automation. The Future Internet aims to provide network resources as a service to ease the automatic design, control and supervision of the telecommunication infrastructure. A key enabler of the Future Internet is the virtualization of the available resources and of the related functionalities. The spread of cloud computing, Software Defined Networking (SDN) and Network Function Virtualization (NFV) technologies has opened the way to full control of programmable networks. Many open-source and commercial implementations have adopted this paradigm, but they expose a fragmented set of dissimilar interfaces that often offer similar or even overlapping functionalities. The result is that uncontrolled, open-loop routines and procedures still require manual intervention. In this paper, we describe an open interface, and its reference implementation, to control programmable networks with a novel closed-loop approach based on end-user feedback. The proposed interface has been implemented as a Future Internet Generic Enabler named OFNIC.
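    The open-loop/closed-loop distinction can be illustrated with a toy proportional controller; everything below (the quality model, the `state` dict, the gain value) is a hypothetical sketch, not the OFNIC API:

```python
state = {"bw": 1.0}  # allocated bandwidth of a virtual link (arbitrary units)

def measure() -> float:
    """Toy end-user feedback: a MOS-like quality score in [1, 5] that
    saturates toward 5 as allocated bandwidth grows."""
    return 5.0 - 4.0 / state["bw"]

def actuate(delta: float) -> None:
    """Apply a bandwidth adjustment requested by the controller."""
    state["bw"] = max(1.0, state["bw"] + delta)

def control_loop(measure, actuate, target=4.0, gain=0.5, steps=100):
    """Closed loop: measure feedback, compare to target, correct, repeat.
    An open-loop routine would configure once and never check again."""
    for _ in range(steps):
        actuate(gain * (target - measure()))

control_loop(measure, actuate)  # drives bw to where the score equals 4
```

    The design point is that the control interface must expose both a measurement path (user feedback in) and an actuation path (resource changes out); exposing only the latter is what leaves today's routines open-loop.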

    PFAT: Post-flight analysis toolkit

    Throughout every phase of a space mission, from conceptual design until disposal, a large amount of data is generated for each engineering discipline involved. Most of this data is produced during the design phases, but additional information is gathered during mission operations. This information relates directly to flight performance, which needs to be evaluated through a post-flight analysis. To this end, and to identify and understand the causes of potential discrepancies between predicted and observed behaviour, a large amount of data needs to be analysed methodically. This is key to increasing the fidelity of analysis and prediction tools, as their outcomes are used in future mission design iterations. PFAT (Post-Flight Analysis Toolkit) is an open-source software package, developed under an ESA contract, which aims at increasing the automation of this process. It allows for the extraction of figures of merit and uncertainties and for the derivation of engineering criteria for further design. PFAT implements a Common Data Structure (CDS) to treat data from different engineering domains in a homogeneous way. The domains covered by PFAT are aero(thermo)dynamics, structural and thermal analysis, propulsion system modelling and trajectory simulation. All modules are written in Python 3, compatible with ESA's Open Simulation Framework (openSF), allowing the definition of advanced processing chains. By virtue of its modular approach, the toolkit can easily be extended with more functionalities, bringing post-processing automation, simple data exchange and robust processing to other engineering disciplines. An extensive validation campaign was designed and executed after the implementation phase to ensure that module compatibility is guaranteed and that the functionalities available for all the engineering domains provide the expected results when compared to external tools. These end-to-end tests are designed by leveraging the openSF capability to join different modules into complex processing chains that simulate real operational scenarios and analyses.
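    The Common Data Structure idea, one schema so that figure-of-merit extraction works identically across engineering domains, can be sketched as follows; the field names and the `figure_of_merit` helper are hypothetical illustrations, not PFAT's actual schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Record:
    """One time series in a CDS-like homogeneous container
    (hypothetical fields, for illustration only)."""
    domain: str                 # e.g. "aerothermodynamics", "trajectory"
    quantity: str               # physical quantity name
    unit: str                   # unit string
    times: List[float]          # epoch-relative timestamps [s]
    values: List[float]         # samples, aligned with `times`
    meta: Dict[str, str] = field(default_factory=dict)

def figure_of_merit(rec: Record, kind: str = "max") -> float:
    """Domain-agnostic extraction: the same code serves any Record,
    whatever discipline produced it."""
    if kind == "max":
        return max(rec.values)
    if kind == "mean":
        return sum(rec.values) / len(rec.values)
    raise ValueError(kind)

heat = Record("aerothermodynamics", "heat_flux", "W/m^2",
              [0.0, 1.0, 2.0], [10.0, 55.0, 30.0])
```

    With every domain normalized to one container, new post-processing modules (statistics, uncertainty bounds, comparison against predictions) apply across all disciplines without per-domain adapters, which is what makes the toolkit cheap to extend.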

    Checkpoint Inhibitors as High-Grade Gliomas Treatment: State of the Art and Future Perspectives

    Glioblastoma (GBM) is the most common and aggressive malignant brain tumor in adults. Despite significant efforts, no therapy has demonstrated a valuable survival benefit beyond the current standard of care. Immune checkpoint inhibitors (ICIs) have revolutionized the treatment landscape and improved patient survival in many advanced malignancies. Unfortunately, these clinical successes have not yet been replicated in the neuro-oncology field. This review summarizes the status of ICI investigation in high-grade gliomas, critically presenting the available data from preclinical models and clinical trials. Moreover, we explore new approaches to increase ICI efficacy, with a particular focus on combinatorial strategies, and potential biomarkers to identify the patients most likely to benefit from immune checkpoint blockade.