93 research outputs found

    Reconstruction of a high-resolution (30 arc-second) data set of daily maximum and minimum temperatures for an area of complex orography (Trentino-Alto Adige)

    Get PDF
    High-resolution climate data are currently in great demand, being indispensable for assessing the impacts of climate change at the local scale across a wide range of applications. To increase the availability of such data for Italy, this study presents the construction of a data set at a resolution of thirty arc-seconds for daily maximum and minimum temperatures over Trentino-Alto Adige, covering the period 1951 to 2014. The methodology used to project the meteorological records of a set of stations onto a high-resolution grid rests on the assumption that the spatio-temporal structure of a meteorological field over a given area can be described as the superposition of two components: the normal values relative to a standard period (the climatology) and the deviations from them (the anomalies). The monthly climatology is interpolated over the whole domain by a weighted linear regression of temperature against elevation, estimated separately at each grid node, with weights tied to the local topography so that, at each node, the greatest importance is assigned to the stations whose characteristics most resemble those of the grid point under consideration. Superimposing on this the monthly anomalies reconstructed on the same grid, obtained by a weighted-average interpolation, yields a 30 arc-second grid of monthly time series in absolute values. Combining the interpolation of the ratios between daily anomalies and the monthly mean for a set of stations with the previously estimated monthly fields then makes it possible to build the data set at daily resolution. Before this last step, the daily records must be synchronized to ensure that the series used contain no time shifts.
The results confirm the effectiveness of the method over orographically complex regions, both in the direct comparison with the case studies, where the spatial discrimination performed by the model is clearly visible, and in the assessment of the accuracy and precision of the results. The resulting data are free of systematic errors, while the mean absolute error is at or below 2 °C, in line with previous studies on other Alpine areas. The method and its results are satisfactory but could be improved further, both through additional optimization of the model and through an increase in the quality of the underlying data.
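    The elevation-weighted regression at each grid node can be sketched as follows. This is a minimal illustration, not the study's code: the Gaussian distance kernel, the scale parameter `h`, and all names are assumptions; the actual weights also account for topographic similarity.

    ```python
    import numpy as np

    def grid_climatology(stn_temp, stn_elev, stn_xy, grid_elev, grid_xy, h=20.0):
        """Estimate the monthly climatology at each grid node via a weighted
        linear regression of station temperature against elevation.
        Weights decay with horizontal distance (Gaussian kernel, scale h, in km
        here), so the stations most similar to the node dominate the fit.
        Illustrative sketch only; kernel choice and names are assumptions."""
        out = np.empty(len(grid_xy))
        A = np.vstack([np.ones_like(stn_elev), stn_elev]).T  # design matrix: T = a + b*z
        for i, (gx, ge) in enumerate(zip(grid_xy, grid_elev)):
            d = np.linalg.norm(stn_xy - gx, axis=1)          # horizontal distance to node
            W = np.diag(np.exp(-(d / h) ** 2))               # topography-linked weights
            coef = np.linalg.solve(A.T @ W @ A, A.T @ W @ stn_temp)  # weighted least squares
            out[i] = coef[0] + coef[1] * ge                  # evaluate at node elevation
        return out
    ```

    Daily values would then be obtained, as described above, by superimposing interpolated anomalies on this climatological field.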

    The design, deployment, and testing of kriging models in GEOframe with SIK-0.9.8

    Get PDF
    This work presents a software package for the interpolation of climatological variables, such as temperature and precipitation, using kriging techniques. The purposes of the paper are (1) to present a geostatistical software that is easy to use and easy to plug in to a hydrological model; (2) to provide a practical example of carefully designed software from the perspective of reproducible research; and (3) to demonstrate the quality of the software's results, providing a reliable alternative to other, more traditional tools. A total of 11 types of theoretical semivariograms and four types of kriging were implemented and gathered into Object Modeling System-compliant components. The package provides real-time optimization of semivariogram and kriging parameters. The software was tested using a year's worth of hourly temperature readings and a rain storm event (11 h) recorded in 2008 and retrieved from 97 meteorological stations in the Isarco River basin, Italy. For both variables, good interpolation results were obtained and then compared to the results from the R package gstat.
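    The core of any such package, ordinary kriging with a theoretical semivariogram, can be sketched in a few lines. This is a generic textbook formulation for illustration, not the SIK-0.9.8 components (which also handle parameter optimization and ten further semivariogram models); the exponential model and all names are assumptions.

    ```python
    import numpy as np

    def exp_semivariogram(h, nugget, sill, rang):
        """Exponential model: gamma(h) = nugget + (sill - nugget)*(1 - exp(-3h/range))."""
        return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rang))

    def ordinary_kriging(xy, z, x0, nugget=0.0, sill=1.0, rang=10.0):
        """Ordinary-kriging estimate at point x0 from observations z at xy.
        Builds the (n+1)x(n+1) system with the unbiasedness constraint
        (weights sum to one) and solves for the weights."""
        n = len(z)
        d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)  # pairwise distances
        G = np.empty((n + 1, n + 1))
        G[:n, :n] = exp_semivariogram(d, nugget, sill, rang)
        G[n, :n] = G[:n, n] = 1.0    # Lagrange row/column for sum(w) = 1
        G[n, n] = 0.0
        g = np.append(exp_semivariogram(np.linalg.norm(xy - x0, axis=1),
                                        nugget, sill, rang), 1.0)
        w = np.linalg.solve(G, g)
        return float(w[:n] @ z)      # kriged estimate at x0
    ```

    With a zero nugget, the estimator is exact: predicting at an observation location returns the observed value, a standard sanity check for any kriging implementation.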

    Pilot-scale Production and Viability Analysis of Freeze-Dried Probiotic Bacteria Using Different Protective Agents

    Get PDF
    The functional food industry requires an improvement of probiotic strain stability during storage, especially when they are stored at room temperature. In this study, the viability of freeze-dried Lactobacillus rhamnosus IMC 501Âź and Lactobacillus paracasei IMC 502Âź using different protective agents (i.e., glycerine, mannitol, sorbitol, inulin, dextrin, CrystaleanÂź) was determined and compared with semi skimmed milk (SSM) control. No significant differences were observed between the tested protectants and the control (SSM) during storage at refrigerated conditions. During storage at room temperature, only glycerine was found to stabilize viability better than other tested substances

    The Italian open data meteorological portal: MISTRAL

    Get PDF
    At the national level, in Italy, observational and forecast data are collected by various public bodies and are often kept in various small, heterogeneous and non-interoperable repositories, released under different licenses, thus limiting their usability for external users. In this context, MISTRAL (the Meteo Italian SupercompuTing PoRtAL) was launched as the first Italian meteorological open data portal, with the aim of promoting the reuse of meteorological data sets available at the national level. The MISTRAL portal provides (and archives) meteorological data from various observation networks, both public and private, and forecast data that are generated and post-processed within the Consortium for Small-scale Modeling-Limited Area Model Italia (COSMO-LAMI) agreement using high performance computing (HPC) facilities. Also incorporated is the Italy Flash Flood use case, implemented in collaboration with the European Centre for Medium-Range Weather Forecasts (ECMWF), which exploits cutting-edge advances in HPC-based post-processing of ensemble precipitation forecasts, for different model resolutions, and applies them to deliver novel blended-resolution forecasts specifically for Italy. Finally, in addition to providing architectures for the acquisition and display of observational data, MISTRAL also delivers an interactive system for visualizing forecast data of different resolutions as superimposed multi-layer maps.

    Comparing Evapotranspiration Estimates from the GEOframe-Prospero Model with Penman–Monteith and Priestley-Taylor Approaches under Different Climate Conditions

    No full text
    Evapotranspiration (ET) is a key variable in the hydrological cycle; it directly affects the surface balance, and its accurate assessment is essential for sound water management. ET is difficult to measure, since the existing methods for its direct estimation, such as the weighing lysimeter or the eddy-covariance system, are often expensive and require well-trained research personnel. To overcome this limitation, different authors have developed experimental models for the indirect estimation of ET. However, since the accuracy of ET prediction is crucial from several points of view, the continuous search for ever more precise modeling approaches is encouraged. In light of this, the aim of the present work is to test the efficiency of a newly introduced physically based model, named Prospero, in predicting ET fluxes. Prospero computes ET using a multi-layer canopy model, solving the energy balance for both sunlit and shaded vegetation and extending the recently developed Schymanski and Or method to the canopy level. Additionally, Prospero is able to compute actual ET using a Jarvis-like model. The model is integrated as a component in the hydrological modelling system GEOframe. Its estimates were validated against observed data from five eddy-covariance (EC) sites with different climatic conditions and the same vegetation cover. Its performance was then compared with that of two already consolidated models, the Priestley-Taylor model and the FAO Penman-Monteith model, using four goodness-of-fit indices. Subsequently, a calibration of the three methods was carried out using LUCA calibration within GEOframe, with the aim of reducing prediction errors. The results showed that Prospero is more accurate and precise than the other two models, even when no calibration was performed, with better performance in dry climatic conditions.
In addition, the Prospero model turned out to be the least affected by the calibration procedure and can therefore also be used effectively in contexts of data scarcity.
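    Of the two benchmark models mentioned above, the Priestley-Taylor estimate is the simpler and can be sketched directly. This is the standard textbook form with FAO-56 constants, not the GEOframe implementation; the sea-level psychrometric constant and all names are assumptions.

    ```python
    import math

    def priestley_taylor_et(rn, g, t_air, alpha=1.26):
        """Priestley-Taylor potential ET (mm/day), textbook form.
        rn, g: net radiation and soil heat flux in MJ m-2 day-1;
        t_air: air temperature in deg C; alpha: Priestley-Taylor coefficient."""
        # saturation vapour pressure (kPa) and its slope (kPa/degC), FAO-56 formulas
        es = 0.6108 * math.exp(17.27 * t_air / (t_air + 237.3))
        delta = 4098.0 * es / (t_air + 237.3) ** 2
        gamma = 0.066   # psychrometric constant (kPa/degC), assumed at sea level
        lam = 2.45      # latent heat of vaporization (MJ/kg)
        return alpha * (delta / (delta + gamma)) * (rn - g) / lam
    ```

    A radiation-based model of this kind needs only temperature and radiation inputs, which is why such benchmarks remain common where station data are scarce.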

    Presentations & Posters

    No full text

    BEYOND WATER, Infrastructural Prevention from Flooding

    No full text
    The aim of this work is to bring together a variety of data sets, including urban-scale plans, data-driven simulations and government regulations related to flood-risk areas, in order to develop infrastructural plans and architectural responses to flooding. In the field of management and prevention of catastrophic events, the proposal is to create a telematic map of risk that brings together data on nature, the built environment, current policies and forecasting. We shall reassess map production so that environment and risk belong equally to the definition of 'territory'. Considering the rate of change of cities and the continuous replacement of the housing stock, it is necessary to adopt operational plans that intervene in environmental and infrastructural planning, the architectural project, and ICT systems design. Only an overall and simultaneous urban vision and management can bridge the current gaps between programming documents and executive plans. Hence the role of the architect is to connect the different layers of the city, in order to form strategic planning processes related to the contingencies of a constantly changing reality. The architecture of the city, as well as country planning, depends on considering the territory as a whole, and maps as the platform on which it is possible to study future actions. The main aim of the Telematic Map of the Risk is to use digital techniques to rematerialize reality.

    Students

    No full text