
    RAPID WEBGIS DEVELOPMENT FOR EMERGENCY MANAGEMENT

    The use of spatial data during emergency response and management helps to make faster and better decisions. Moreover, spatial data should be as up to date as possible and easy to access. To meet the challenge of rapid, up-to-date data sharing, the use of the internet is widely considered the most efficient solution, and the field of web mapping there is constantly evolving. ITHACA (Information Technology for Humanitarian Assistance, Cooperation and Action) is a non-profit association founded by Politecnico di Torino and SITI (Higher Institute for the Environmental Systems) as a joint project with the WFP (World Food Programme). The collaboration with the WFP drives several projects related to Early Warning Systems (i.e. flood and drought monitoring) and Early Impact Systems (e.g. rapid mapping and assessment through remote sensing systems). The Web GIS team has built, and is continuously improving, a complex architecture based entirely on Open Source tools. This architecture is composed of three main areas: the database environment, the server-side logic and the client-side logic. Each of them is implemented following the MVC (Model-View-Controller) pattern, which means the separation of the different logic layers (database interaction, business logic and presentation). The MVC architecture makes it possible to build a Web GIS application for data viewing and exploration quickly and easily. In an emergency, data publication can be performed almost immediately, as soon as data production is completed. The server side is based on the Python language and the Django web development framework, while the client side is based on OpenLayers, GeoExt and Ext.js, which manage data retrieval and the user interface. The MVC pattern applied to JavaScript keeps the interface-generation and data-retrieval logic separate from the general application configuration, so the server-side environment can take care of generating the configuration file. 
The web-application building process is data driven and can be considered a view of the current architecture, composed of data and data-interaction tools. Once completely automated, the Web GIS application building process can be performed directly by the end user, who can customize the data layers and the controls used to interact with them.
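The data-driven building process described above could be sketched roughly as follows: the server side renders a configuration file from the registered layers, and the client consumes it to assemble the viewer. This is a minimal illustration only; the layer records, field names and JSON shape are hypothetical, not the actual ITHACA schema, and a real deployment would read the layer registry from the spatial database.

```python
import json

# Hypothetical layer registry: in the real architecture these records
# would come from the spatial database, not a hard-coded list.
LAYERS = [
    {"name": "flood_extent", "title": "Flood extent", "visible": True},
    {"name": "admin_boundaries", "title": "Admin boundaries", "visible": False},
]

def build_client_config(layers):
    """Render the JSON configuration a client-side viewer would consume.

    Each database layer record becomes one entry in the viewer's layer
    list, so publishing new data only requires adding a record.
    """
    return json.dumps(
        {
            "map": {"projection": "EPSG:3857"},
            "layers": [
                {
                    "source": {"type": "wms", "layer": layer["name"]},
                    "title": layer["title"],
                    "visible": layer["visible"],
                }
                for layer in layers
            ],
        },
        indent=2,
    )

print(build_client_config(LAYERS))
```

Because the configuration is generated rather than hand-written, publishing a new dataset during an emergency reduces to registering it once on the server side.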

    Manual for CEMS-Rapid Mapping Products: Valid for the portfolio since April 2019, status 31 August 2020

    This Manual provides guidance on the use and interpretation of products delivered by the Rapid Mapping service. Rapid Mapping is a module of the Copernicus Emergency Management Service (CEMS) - one of the six core services of the European Union’s Earth observation programme Copernicus. The JRC is responsible for CEMS and implements a part of it through service contracts with European industry and academia. Rapid Mapping is one of the two modules under CEMS’ Mapping component, which delivers on-demand geospatial information derived from remote sensing data to support emergencies that require an immediate response. Rapid Mapping in particular provides, in rush mode (24/7), geospatial information on the impact of a selected disaster anywhere in the world, derived from optical and radar satellite images (typically at resolutions <10m). It can be directly triggered by so-called authorised users in the European Member States and other countries participating in the European Civil Protection Mechanism (one authorised user per country, typically the civil protection authority) through DG ECHO’s Emergency Response Coordination Centre in Brussels. Non-authorised users can activate the service through an authorised user. This manual is the reference document to which any user should refer for handling Rapid Mapping products. The product delivery package of Rapid Mapping contains a series of ready-to-print maps and a vector data package. This manual gives a detailed overview of the product characteristics, describing both the ready-to-print maps and the vector data package. The document is an updated and extended version of the “Product User Manual of Copernicus EMS Rapid Mapping” (Dorati et al., 2018) and reflects changes which were introduced in the product portfolio in April 2019. The initial version was extended to increase the transparency and usability of the products; sections were added on, for example, ready-to-print maps, quality control and product releases. 
The content of this manual is also available online at https://emergency.copernicus.eu/mapping/ems/online-manual-rapid-mapping-products. This technical report reflects the status as of 31 August 2020. The online manual is updated regularly whenever minor revisions are made. New versions of this technical report will be issued only in case there are major changes to the portfolio. JRC.E.1 - Disaster Risk Management

    2017 User Workshop of the Copernicus Emergency Management Service – Summary Report

    This report summarises the User Workshop of the Copernicus Emergency Management Service (EMS) – Mapping component, which was held on 20-21 June 2017 at the Joint Research Centre (JRC) in Ispra, Italy. The User Workshop is the annual forum at which users, service providers, the Commission and other stakeholders exchange views and experiences of the Copernicus EMS - Mapping component. It was attended by 50 participants from across Europe, of whom eighteen were users of this service component. The focus of the User Workshop was on the two on-demand Mapping services - i.e. “Rapid Mapping” and “Risk and Recovery Mapping” - which provide geospatial information in support of all phases of disaster management. The information is mainly derived from satellite imagery and complemented by available ancillary data. The first day of the Workshop focused on providing insights into the technical and scientific capacity of the “Risk & Recovery” Mapping service, which delivers maps and analysis in support of disaster risk reduction, preparedness and prevention, recovery and reconstruction. The aim of this part of the Workshop was to increase awareness of this service module, which is less known than the “Rapid Mapping” service - the “24/7” (i.e. always on) service supporting emergency response operations. Users were invited to present their experience with both service modules, and a live demo of Unmanned Aerial Systems (UAS) was given to show the potential of these platforms for the fast provision of airborne imagery in an emergency situation. The second day of the Workshop addressed the evolution of Copernicus EMS - Mapping. Two Horizon 2020 projects were introduced and discussed: iREACT (http://www.i-react.eu/) looks at exploiting advanced cyber technologies for disaster management, while E2mC (https://www.e2mc-project.eu/) focuses on exploiting social data and crowdsourcing for use in Rapid Mapping. 
Other evolution-related topics addressed were links with the two Copernicus EMS Early Warning Systems (i.e. the European Flood Awareness System and the European Forest Fire Information System), product dissemination and potential new products. All topics were further discussed in groups. As every year, the discussions at the User Workshop are summarised and processed by the JRC, with a view to guiding the overall evolution of the service. The workshop agenda and presentations are available at: http://emergency.copernicus.eu/mapping/ems/copernicus-ems-mapping-user-workshop-2017. JRC.E.1 - Disaster Risk Management

    Probiotic Sonicates Selectively Induce Mucosal Immune Cells Apoptosis through Ceramide Generation via Neutral Sphingomyelinase

    This is an open-access article distributed under the terms of the Creative Commons Attribution License. [Background]: Probiotics appear to be beneficial in inflammatory bowel disease, but their mechanism of action is incompletely understood. We investigated whether probiotic-derived sphingomyelinase mediates this beneficial effect. [Methodology/Principal Findings]: Neutral sphingomyelinase (NSMase) activity was measured in sonicates of the probiotics L. brevis (LB) and S. thermophilus (ST) and the non-probiotics E. coli (EC) and E. faecalis (EF). Lamina propria mononuclear cells (LPMC) were obtained from patients with Crohn's disease (CD) and Ulcerative Colitis (UC), and peripheral blood mononuclear cells (PBMC) from healthy volunteers, analysing LPMC and PBMC apoptosis susceptibility, reactive oxygen species (ROS) generation and JNK activation. In some experiments, sonicates were preincubated with GSH or GW4869, a specific NSMase inhibitor. NSMase activity of LB and ST sonicates was 10-fold that of EC and EF sonicates. LB and ST sonicates induced significantly more apoptosis of CD and UC LPMC than of control LPMC, whereas EC and EF sonicates failed to induce apoptosis. Pre-stimulation with anti-CD3/CD28 induced a significant and time-dependent increase in LB-induced apoptosis of LPMC and PBMC. Exposure to LB sonicates resulted in JNK activation and ROS production by LPMC. NSMase activity of LB sonicates was completely abrogated by GW4869, which caused a dose-dependent reduction of LB-induced apoptosis. LB and ST selectively induced immune cell apoptosis, an effect dependent on the degree of cell activation and mediated by bacterial NSMase. 
[Conclusions]: These results suggest that induction of immune cell apoptosis is a mechanism of action of some probiotics, and that NSMase-mediated ceramide generation contributes to the therapeutic effects of probiotics. The funding sources included grants from Centro de Investigación Biomédica en Red de Enfermedades Hepáticas y Digestivas (CIBERehd), Ministerio de Ciencia e Innovación (SAF2005-00280 and SAF2008-03676 to MS, FIS2009-00056 to AM, SAF2009-11417 to JCF), Fundación Ramón Areces (to MS), the National Institutes of Health (DK30399 and DK50984 to CF) and the Research Center for Liver and Pancreatic Diseases funded by the United States National Institute for Alcohol Abuse and Alcoholism (P50 AA 11999 to JCF). Peer reviewed

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as specific markers for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

    WebGis Architectures for Emergency Response

    In the information era, the internet is the main source from which to find anything. The history of the internet is recent and brief, about 20 years, during which it has evolved continuously and quickly. The information offered in the first years was flat, like pure text or documents, but in the last 10 years the type of information shared has changed, moving beyond the flat page to other dimensions. Now we can experience videos, music and even maps. The concept of attaching geographic information to the information itself has spread to almost every site. Maps are everywhere because the question “where does this come from?” has assumed the same importance as the “what”. Georeferencing and managing spatial information is not the same as managing non-spatial information: there is an increase in complexity due to the fact that geographic information is not absolute. In the world there are thousands of spatial reference systems (SRS) and different standards on how to share spatial data. Whoever hosts spatial data must allow data to be retrieved in different formats and different SRS. On the other hand, with spatial data we can perform spatial analysis such as distance calculation, routing or proximity queries, and many services which deal with spatial data, such as Google Maps, offer additional services of this kind. In many cases web spatial information is gaining importance: paper maps have always been used to plan the environment and the actions on it, but digital mapping allows more degrees of freedom, granting the ability to customize what we want and how we want it presented, wherever we are. Humanitarian emergencies are the core example of how the management of information and its usage can be completely separated in space: whoever manages and keeps spatial data can be in an HQ office, while people in the field can use the information without worrying about obtaining it physically. 
The only thing needed is an internet link, which is now granted by satellite communication even where ground infrastructure is lacking. This PhD thesis has been developed in the framework of the collaboration between Information Technology for Humanitarian Assistance, Cooperation and Action (ITHACA), a non-profit association founded by the Politecnico di Torino and the Istituto Superiore per i Sistemi Territoriali e l'Innovazione (SITI), and the World Food Programme (WFP). The collaboration with the WFP characterizes the research projects in the field of humanitarian emergencies and the goal of the present thesis. One of the core factors that determine a good emergency response is data availability. This term means both immediate availability after the event and quality of what is available, i.e. having the most up-to-date and precise data. To meet the challenge of rapid WebGis development and data sharing in case of emergency, this work has focused on the development of an architecture composed of Open Source tools, integrated and tuned to allow WebGis and web applications to be built in a short time while maintaining a high level of customization. The geographic Open Source community offers several tools dedicated to specific tasks such as data publishing, data management or user interfaces; however, these projects have always been independent from each other, without a common design able to harmonize the efforts into one single tool. The idea is to have a unique server architecture composed of data publishing and management systems able to output data in different ways, based on the web HTTP protocol and respecting international standards. Web applications should act as custom views on the architecture, and web services should allow access to data regardless of the requesting source. The development process has involved deep research into basic software tools and operating systems as well as high-level programming languages. 
Through the development of several web applications, an environment able to manage different technologies and data sources has been built. The goal is to test the proposed solution during real case studies in order to assess its usability, stability and effectiveness during emergencies.
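The interoperability problem the abstract raises - thousands of spatial reference systems, and servers that must serve data in several of them - can be illustrated with the single most common conversion in web mapping: between WGS84 geographic coordinates (EPSG:4326) and the spherical Web Mercator projection (EPSG:3857) used by most web map clients. This is a self-contained sketch of that one transformation only; production systems delegate general SRS handling to a library such as PROJ.

```python
import math

# WGS84 semi-major axis in metres; the spherical Web Mercator
# projection treats the Earth as a sphere of this radius.
EARTH_RADIUS = 6378137.0

def lonlat_to_webmercator(lon, lat):
    """Project degrees of longitude/latitude to Web Mercator metres."""
    x = math.radians(lon) * EARTH_RADIUS
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * EARTH_RADIUS
    return x, y

def webmercator_to_lonlat(x, y):
    """Inverse projection: Web Mercator metres back to degrees."""
    lon = math.degrees(x / EARTH_RADIUS)
    lat = math.degrees(2 * math.atan(math.exp(y / EARTH_RADIUS)) - math.pi / 2)
    return lon, lat
```

A server honouring OGC standards would perform a transformation like this (or any of thousands of others) on the fly, so that each client receives data in the SRS it requested.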
