
    Reduced Order Models and Data Assimilation for Hydrological Applications

    The present thesis concerns the study of Monte Carlo (MC)-based data assimilation methods applied to the numerical simulation of complex hydrological models with stochastic parameters. The ensemble Kalman filter (EnKF) and the sequential importance resampling (SIR) filter are implemented in the CATHY model, a solver that couples subsurface water flow in porous media with surface water dynamics. A detailed comparison of the results given by the two filters in a synthetic test case highlights the main benefits and drawbacks associated with these techniques. A modification of the SIR update is suggested to improve the performance of the filter for small ensemble sizes and small measurement-error variances. With this modification, both filters are able to assimilate pressure head and streamflow measurements and to correct model errors, such as biased initial and boundary conditions. The SIR technique seems better suited to the simulations at hand, as it does not rely on the Gaussian approximation inherent in the EnKF method. Further research is needed, however, to assess the robustness of particle filter methods, in particular to ensure accurate results even when relatively small ensemble sizes are employed. In the second part of the thesis the focus shifts to reducing the computational burden associated with the construction of the MC realizations, which constitute the core of the EnKF and SIR. With this goal, we analyze the computational savings associated with the use of reduced order models (RM) for the generation of the ensemble of solutions. The proper orthogonal decomposition (POD) is applied to the linear equations of groundwater flow in saturated porous media with a randomly distributed recharge and a random heterogeneous hydraulic conductivity. Several test cases are used to assess the errors in the ensemble statistics caused by the RM approximation.
    Particular attention is given to the efficient computation of the principal components needed to project the model equations onto the reduced space. A greedy algorithm selects the snapshots from the set of MC realizations in such a way that the final principal components are parameter independent. An innovative residual-based estimate of the error associated with the RM solution is used to assess the precision of the RM and to stop the iterations of the greedy algorithm. By way of numerical applications in synthetic and real scenarios, we demonstrate that this modified greedy algorithm determines the minimum number of principal components to use in the reduction and thus leads to important computational savings.
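    The analysis step that ensemble methods of this kind build on can be illustrated with a minimal ensemble Kalman filter update. The sketch below uses hypothetical names and is not the CATHY implementation; it assumes a linear observation operator `H` and uncorrelated observation errors with variance `obs_var`, and perturbs the observations to keep the analysis ensemble spread statistically consistent:

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_var, rng):
    """One EnKF analysis step with perturbed observations.

    ensemble : (n_state, n_ens) forecast ensemble
    obs      : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    obs_var  : observation-error variance (errors assumed uncorrelated)
    """
    n_obs, n_ens = len(obs), ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)    # ensemble anomalies
    HA = H @ A
    PHt = A @ HA.T / (n_ens - 1)                           # sample P H^T
    S = HA @ HA.T / (n_ens - 1) + obs_var * np.eye(n_obs)  # innovation covariance
    K = PHt @ np.linalg.inv(S)                             # Kalman gain
    # Perturbing the observations keeps the analysis spread consistent
    obs_pert = obs[:, None] + rng.normal(0.0, np.sqrt(obs_var), (n_obs, n_ens))
    return ensemble + K @ (obs_pert - H @ ensemble)

rng = np.random.default_rng(0)
ens = rng.normal(0.0, 1.0, (3, 50))      # 3 state variables, 50 members
H = np.array([[1.0, 0.0, 0.0]])          # observe only the first variable
updated = enkf_update(ens, np.array([0.5]), H, 0.1 ** 2, rng)
```

    With an accurate observation (small `obs_var`), the updated mean of the observed variable moves close to the measurement and its spread contracts, while unobserved variables are corrected only through their sample correlations with the observed one; it is exactly this built-in Gaussian/linear structure that the SIR filter avoids.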

    Examination of the seepage face boundary condition in subsurface and coupled surface/subsurface hydrological models

    A seepage face is a nonlinear dynamic boundary that strongly affects pressure head distributions, water table fluctuations, and flow patterns. Its handling in hydrological models, especially under complex conditions such as heterogeneity and coupled surface/subsurface flow, has not been extensively studied. In this paper, we compare the treatment of the seepage face as a static (Dirichlet) versus dynamic boundary condition, we assess its resolution under conditions of layered heterogeneity, we examine its interaction with a catchment outlet boundary, and we investigate the effects of surface/subsurface exchanges on seepage faces forming at the land surface. The analyses are carried out with an integrated catchment hydrological model. Numerical simulations are performed for a synthetic rectangular sloping aquifer and for an experimental hillslope from the Landscape Evolution Observatory. The results show that the static boundary condition is not always an adequate stand-in for a dynamic seepage face boundary condition, especially under conditions of high rainfall, steep slope, or heterogeneity; that hillslopes with layered heterogeneity give rise to multiple seepage faces that can be highly dynamic; that seepage face and outlet boundaries can coexist in an integrated hydrological model and both play an important role; and that seepage faces at the land surface are not always controlled by subsurface flow. The paper also presents a generalized algorithm for resolving seepage face outflow that handles heterogeneity in a simple way, is applicable to unstructured grids, and is shown experimentally to be equivalent to the treatment of atmospheric boundary conditions in subsurface flow models.
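    The switching logic behind a dynamic seepage face boundary condition can be sketched at the level of a single boundary node. The function below is a hypothetical illustration (names and sign conventions are assumptions, not the model's actual code): a node held at zero pressure head (Dirichlet) stays that way only while the computed boundary flux is outward; otherwise it reverts to a no-flow condition, and a no-flow node switches to Dirichlet once its pressure head becomes positive, i.e. saturation is reached:

```python
def update_seepage_nodes(psi, flux_out, is_dirichlet):
    """One pass of the seepage-face switching check (hypothetical sketch).

    psi          : pressure head at each candidate seepage node
    flux_out     : computed outward boundary flux (positive = exfiltration)
    is_dirichlet : current per-node status (True = saturated, psi held at 0)
    """
    new_status = []
    for p, q, dirichlet in zip(psi, flux_out, is_dirichlet):
        if dirichlet:
            new_status.append(q > 0.0)   # inward flux: revert to no-flow
        else:
            new_status.append(p > 0.0)   # saturation reached: exit point rises
    return new_status

# Node 0 keeps draining, node 1 has just saturated, node 2 stops exfiltrating:
status = update_seepage_nodes(psi=[-0.5, 0.2, 0.0],
                              flux_out=[1.0, 0.0, -0.3],
                              is_dirichlet=[True, False, True])
```

    Because the check is purely per-node, this kind of rule extends naturally to unstructured grids and heterogeneous layers, which is in the spirit of the generalized algorithm described above.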

    How the interpretation of drivers' behavior in virtual environment can become a road design tool: a case study

    Driving is the result of a psychological process that translates data, signals and direct/indirect messages into behavior, which is continuously adapted to the exchange of varying stimuli between driver, environment and vehicle. These stimuli are at times not perceived, and at other times perceived but not understood by the driver, even when they derive from tools (vertical signs, horizontal markings) specifically conceived for their safety. The result is unsafe behavior by vehicle drivers. For this reason, the road environment needs to be radically redesigned. The paper describes research, based on surveys in real and virtual environments, aimed at better understanding drivers' action-reaction mechanisms within different scenarios, in order to gain information useful for a correct organization (design) of the road space. The driving simulator can help in moving the study of new road design tools (geometrical, compositional, constructive ones, street furniture, etc.) from the road to the laboratory, because it can be used to evaluate solutions before their usefulness is proved on the road.

    Spatially explicit effective reproduction numbers from incidence and mobility data

    Current methods for near real-time estimation of effective reproduction numbers from surveillance data overlook mobility fluxes of infectors and susceptible individuals within a spatially connected network (the metapopulation). Exchanges of infections among different communities may thus be misrepresented unless explicitly measured and accounted for in the renewal equations. Here, we first derive the equations that include spatially explicit effective reproduction numbers, ℛk(t), in an arbitrary community k. These equations embed a suitable connection matrix blending mobility among connected communities and mobility-related containment measures. Then, we propose a tool to estimate, in a Bayesian framework involving particle filtering, the values of ℛk(t) maximizing a suitable likelihood function reproducing observed patterns of infections in space and time. We validate our tools against synthetic data and apply them to real COVID-19 epidemiological records in a severely affected and carefully monitored Italian region. Differences arising between connected and disconnected reproduction numbers (the latter being calculated with existing methods, to which our formulation reduces by setting mobility to zero) suggest that current standards may be improved in their estimation of disease transmission over time.
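    The structure of such a spatially explicit renewal equation can be sketched as follows. This is a simplified, hypothetical illustration of the idea, not the paper's actual likelihood or particle-filter machinery: the expected incidence in community k is ℛk(t) times the infectious pressure generated by past incidence in all communities, redistributed through a connection matrix `C`; setting `C` to the identity recovers the disconnected, purely local renewal equation:

```python
import numpy as np

def renewal_step(R, incidence_hist, gen_pdf, C):
    """Expected next-day incidence per community in a spatial renewal model.

    R              : (n_comm,) effective reproduction numbers R_k(t)
    incidence_hist : (n_days, n_comm) past incidence, most recent day last
    gen_pdf        : (n_max_lag,) generation-interval pdf; entry 0 = lag of 1 day
    C              : (n_comm, n_comm) connection matrix; C[j, k] is the fraction
                     of community j's infectious pressure exerted in community k
    """
    lagged = incidence_hist[::-1]            # row 0 = lag of 1 day
    n = min(len(gen_pdf), len(lagged))
    pressure = gen_pdf[:n] @ lagged[:n]      # local infectious pressure, (n_comm,)
    return R * (pressure @ C)                # mix through mobility, scale by R_k

# Two communities, all past cases in community 0, no mobility (C = identity):
R = np.array([1.0, 2.0])
hist = np.array([[10.0, 0.0],
                 [20.0, 0.0]])
g = np.array([0.6, 0.4])
nxt = renewal_step(R, hist, g, np.eye(2))    # reduces to the local renewal equation
```

    With an off-diagonal `C`, part of community 0's infectious pressure would instead be attributed to community 1, which is precisely the exchange of infections that disconnected estimators miss.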

    A stratified compartmental model for the transmission of Sparicotyle chrysophrii (Platyhelminthes: Monogenea) in gilthead seabream (Sparus aurata) fish farms

    The rapid development of intensive fish farming has been associated with the spread of infectious diseases, pathogens and parasites. One such parasite is Sparicotyle chrysophrii (Platyhelminthes: Monogenea), which commonly infects cultured gilthead seabream (Sparus aurata), a vital species in Mediterranean aquaculture. The parasite attaches to fish gills and can cause epizootics in sea cages, with relevant consequences for fish health and associated economic losses for fish farmers. In this study, a novel stratified compartmental epidemiological model of S. chrysophrii transmission was developed and analysed. The model accounts for the temporal progression of the number of juvenile and adult parasites attached to each fish, as well as the abundance of eggs and oncomiracidia. We applied the model to data collected in a seabream farm, where the fish population and the number of adult parasites attached to fish gills were closely monitored in six different cages for 10 months. The model successfully replicated the temporal dynamics of the distribution of parasite abundance within fish hosts and simulated the effects of environmental factors, such as water temperature, on the transmission dynamics. The findings highlight the potential of modelling tools for farming management, aiding in the prevention and control of S. chrysophrii infections in Mediterranean aquaculture.
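    A deliberately simplified sketch of such a life-cycle model (tracking mean burdens rather than the paper's full stratification by per-fish parasite count, with hypothetical rate names) might couple the mean juvenile and adult parasites per fish with the free-living egg and oncomiracidium pools:

```python
def lifecycle_step(state, dt, p):
    """One forward-Euler step of a simplified parasite life-cycle model.

    state : dict with mean juvenile (J) and adult (A) parasites per fish,
            plus free-living egg (E) and oncomiracidium (O) abundances
    p     : dict of rates (all names are hypothetical placeholders):
            beta  - per-fish attachment rate of oncomiracidia
            gamma - juvenile-to-adult maturation rate
            phi   - egg production rate per adult parasite
            eta   - egg hatching rate
            mu_J, mu_A, mu_E, mu_O - stage-specific mortality rates
            n_fish - number of fish in the cage
    """
    J, A, E, O = state["J"], state["A"], state["E"], state["O"]
    dJ = p["beta"] * O - (p["gamma"] + p["mu_J"]) * J   # attachment in, maturation/death out
    dA = p["gamma"] * J - p["mu_A"] * A                 # maturation in, death out
    dE = p["phi"] * A * p["n_fish"] - (p["eta"] + p["mu_E"]) * E
    dO = p["eta"] * E - (p["beta"] * p["n_fish"] + p["mu_O"]) * O
    return {"J": J + dt * dJ, "A": A + dt * dA,
            "E": E + dt * dE, "O": O + dt * dO}
```

    In a model of the kind described above, water temperature would enter through the rates (e.g., hatching and maturation), and the fish population would be stratified into classes by parasite burden instead of tracking only the mean.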

    Near real-time forecasting for cholera decision making in Haiti after Hurricane Matthew

    Computational models of cholera transmission can provide objective insights into the course of an ongoing epidemic and aid decision making on allocation of health care resources. However, models are typically designed, calibrated and interpreted post-hoc. Here, we report the efforts of a team from academia, field research and humanitarian organizations to model in near real-time the Haitian cholera outbreak after Hurricane Matthew in October 2016, to assess risk and to quantitatively estimate the efficacy of a then ongoing vaccination campaign. A rainfall-driven, spatially-explicit meta-community model of cholera transmission was coupled to a data assimilation scheme for computing short-term projections of the epidemic in near real-time. The model was used to forecast cholera incidence for the months after the passage of the hurricane (October-December 2016) and to predict the impact of a planned oral cholera vaccination campaign. Our first projection, from October 29 to December 31, predicted the highest incidence in the departments of Grande Anse and Sud, accounting for about 45% of the total cases in Haiti. The projection included a second peak in cholera incidence in early December largely driven by heavy rainfall forecasts, confirming the urgency for rapid intervention. A second projection (from November 12 to December 31) used updated rainfall forecasts to estimate that 835 cases would be averted by vaccinations in Grande Anse (90% Prediction Interval [PI] 476-1284) and 995 in Sud (90% PI 508-2043). The experience gained by this modeling effort shows that state-of-the-art computational modeling and data-assimilation methods can produce informative near real-time projections of cholera incidence. Collaboration among modelers and field epidemiologists is indispensable to gain fast access to field data and to translate model results into operational recommendations for emergency management during an outbreak. 
Future efforts should thus draw together multidisciplinary teams to ensure that model outputs are appropriately grounded, interpreted and communicated.
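    The 90% prediction intervals reported above can be obtained directly from the forecast ensemble as central percentile ranges. A minimal sketch (the sample values below are synthetic placeholders, not the study's data):

```python
import numpy as np

def prediction_interval(samples, level=0.90):
    """Central prediction interval from an ensemble of forecast values.

    samples : 1-D array of ensemble forecasts (e.g., cases averted)
    level   : coverage of the central interval (0.90 -> 5th to 95th percentile)
    """
    tail = 100.0 * (1.0 - level) / 2.0
    low, high = np.percentile(samples, [tail, 100.0 - tail])
    return low, high

# Synthetic placeholder ensemble of forecast values:
samples = np.random.default_rng(1).lognormal(mean=6.7, sigma=0.3, size=1000)
low, high = prediction_interval(samples)
median = np.median(samples)
```

    Reporting the ensemble median together with such an interval conveys both the central projection and the forecast uncertainty that decision makers need.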

    Achieving coordinated national immunity and cholera elimination in Haiti through vaccination: a modelling study

    Summary: Background: Cholera was introduced into Haiti in 2010. Since then, more than 820 000 cases and nearly 10 000 deaths have been reported. Oral cholera vaccine (OCV) is safe and effective, but has not been seen as a primary tool for cholera elimination because of its limited period of protection and constrained supplies. Regionally, epidemic cholera is contained to the island of Hispaniola, and the lowest numbers of cases since the epidemic began were reported in 2019. Hence, Haiti may represent a unique opportunity to eliminate cholera with OCV. Methods: In this modelling study, we assessed the probability of elimination, time to elimination, and percentage of cases averted with OCV campaign scenarios in Haiti through simulations from four modelling teams. For a 10-year period from January 19, 2019, to January 13, 2029, we compared a no-vaccination scenario with five OCV campaign scenarios that differed in geographical scope, coverage, and rollout duration. Teams used weekly department-level reports of suspected cholera cases from the Haiti Ministry of Public Health and Population to calibrate the models and used common vaccine-related assumptions, but other model features were determined independently. Findings: Among campaigns with the same vaccination coverage (70% fully vaccinated), the median probability of elimination after 5 years was 0–18% for no vaccination, 0–33% for 2-year campaigns focused on the two departments with the highest historical incidence, 0–72% for three-department campaigns, and 35–100% for nationwide campaigns. Two-department campaigns averted a median of 12–58% of infections, three-department campaigns averted 29–80% of infections, and national campaigns averted 58–95% of infections. Extending the national campaign to a 5-year rollout (compared with a 2-year rollout) reduced the probability of elimination to 0–95% and the proportion of cases averted to 37–86%.
    Interpretation: Models suggest that the probability of achieving zero transmission of Vibrio cholerae in Haiti with current methods of control is low, and that bolder action is needed to promote elimination of cholera from the region. Large-scale cholera vaccination campaigns in Haiti would offer the opportunity to synchronise nationwide immunity, providing near-term population protection while improvements to water and sanitation promote long-term cholera elimination. Funding: Bill & Melinda Gates Foundation, Global Good Fund, Institute for Disease Modeling, Swiss National Science Foundation, and US National Institutes of Health.


    Implementation of an industrial confectionery packaging line using optical vision and pick-and-place robots

    This thesis presents the internship work carried out at Tech.PA S.p.A. in Verona. It describes a product packaging system built around an ABB vision system (PickMaster), four ABB IRB360 FlexPicker delta robots for product picking, and an ABB IRB4600 robot for palletizing the packed boxes. The thesis examines the difficulties of designing an optical vision system suitable for the various product formats and of managing a multi-robot system.