
    Design and Implementation of Machine Learning Models and Algorithms for Flood, Drought and Frazil Prediction

    Natural calamities like floods and droughts pose a significant threat to humanity, impacting millions of people each year and causing substantial economic losses to society. In response to this challenge, this thesis focuses on developing advanced machine learning techniques to improve water height prediction accuracy and thereby aid municipalities in effective flood mitigation. The primary objective of this study is to evaluate an innovative architecture that leverages Long Short-Term Memory (LSTM) neural networks to predict water height accurately in three different environmental scenarios: frazil ice, droughts, and floods due to spring snowmelt. A distinguishing feature of our approach is the incorporation of meteorological forecasts as input parameters to the prediction model. By modeling the intricate relationships between water level data, historical meteorological data, and meteorological forecasts, we evaluate the contribution of forecast data and how forecast inaccuracies affect water-level prediction. We compare the outcomes obtained by incorporating next-hour, next-day, and next-week meteorological data into our novel LSTM model. Our results provide a comprehensive comparison of various input parameters, and our findings suggest that accurate weather forecasts are crucial for reliable water height predictions. Additionally, this study examines the use of IoT sensor data in combination with ML models to enhance the effectiveness of flood prediction and management. We present an online machine learning approach that trains the model continuously on real-time data from IoT sensors. The integration of live sensor data yields a dynamic and adaptive system with superior predictive capabilities compared to traditional static models. By adopting these advanced techniques, we can mitigate the adverse impacts of natural catastrophes and work towards building more resilient and disaster-resistant communities.
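    As a rough illustration of the kind of architecture described above, the following minimal PyTorch sketch feeds a window of past water-level and meteorological observations through an LSTM and concatenates a meteorological forecast vector before the regression head. The class name, feature counts, and layer sizes are assumptions made for illustration, not the thesis implementation.

        # Minimal sketch (assumed layout, not the thesis code): an LSTM maps a
        # window of past water levels and meteorological observations, plus a
        # separate meteorological forecast vector, onto the next water height.
        import torch
        import torch.nn as nn

        class WaterHeightLSTM(nn.Module):
            def __init__(self, n_features=6, hidden=64, forecast_dim=3):
                super().__init__()
                self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
                # forecast features (e.g. next-hour/day/week rainfall, temperature)
                # are concatenated with the last hidden state before the head
                self.head = nn.Linear(hidden + forecast_dim, 1)

            def forward(self, history, forecast):
                # history: (batch, time, n_features); forecast: (batch, forecast_dim)
                _, (h_n, _) = self.lstm(history)
                combined = torch.cat([h_n[-1], forecast], dim=1)
                return self.head(combined)

        model = WaterHeightLSTM()
        x = torch.randn(8, 48, 6)      # 48 past hourly observations, 6 features
        f = torch.randn(8, 3)          # meteorological forecast features
        print(model(x, f).shape)       # -> torch.Size([8, 1])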

    Identification of Enhanced Rainfall Rates Using the Near-Storm Environment for Radar Precipitation Estimates

    Reliable and timely flash flood warnings are critically dependent on the accuracy of real-time rainfall estimates. Precipitation is not only the most vital input for basin-scale accumulation algorithms such as the Flash Flood Monitoring and Prediction (FFMP) program used operationally by the U.S. National Weather Service, but also the primary forcing for hydrologic models at all scales. Quantitative precipitation estimates (QPE) from radar are widely used for this purpose due to their high spatial and temporal resolution compared to rain gauges and satellite-based algorithms. However, converting the native radar variables into an instantaneous rain rate is fraught with uncertainties. One of those uncertainties is the varying relationship of radar observables to rain rate for different regions and storm types due to variations in drop size distributions. Many unique reflectivity-to-rain-rate (Z-R) functions have been proposed in the literature over the past 70 years for single-polarization radars, and it is becoming apparent that a variety of rain rate functions will also be needed in different environments for dual-polarization radars. The challenge then becomes identifying the environments in real time so that the appropriate rain rate function can be applied. This study addresses the challenge of identifying environments conducive to tropical rain rates, or rain rates that are enhanced by highly productive warm rain processes. Rain rates in tropical environments tend to be underestimated by other operational Z-R functions and have often been associated with historic flash flooding events, so delineating them in real time can greatly improve not only radar-based QPE accuracy but also forecasters' confidence in issuing flash flood warnings. Six consecutive months of hourly data from the 2010 warm season were used to train ensembles of statistical classification models so that probabilities of warm rain enhancement of the rain rate could be derived. The predictors for the ensembles were retrieved from the 20-km Rapid Update Cycle (RUC) model analyses and were chosen to provide a general description of the thermodynamic environment from which the rainfall developed. Those environmental predictors were trained against two different predictands: the bias of rain rates from the convective Z-R function relative to collocated, quality-controlled rain gauges, and the vertical gradient of radar reflectivity between the freezing level and the lowest elevation observed by the radar. The resulting probabilities from the trained ensembles were then used to delineate where tropical rain rates would be assigned in a gridded QPE product, and the resulting hourly accumulations were verified against independent rain gauges. Overall, the probability-based precipitation type delineation scheme improved hourly rainfall accumulations for the three independent cases tested when compared to both the legacy rainfall product from the National Mosaic and Multisensor Quantitative Precipitation Estimation (NMQ) project and the operational NWS rainfall product (Stage II), but neither the gauge-based nor the VPR-based ensemble emerged as clearly superior across all cases tested. However, spatial similarities between the two probability fields and similar results from variable importance analysis suggest that both methods are attempting to delineate the same environment.
This implies that the systematic underestimation of radar-based QPE and the enhancement of reflectivity in the warm layer from warm rain hydrometeor growth are related, or at the very least associated with the same type of environment. Initial analysis of polarimetric variables, particularly differential reflectivity, in areas of high and low probabilities also supports a connection between rain rate underestimation and tropical airmasses.
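    For context, the sketch below shows how reflectivity is commonly converted to a rain rate with two widely cited Z-R relations and how a warm-rain probability field could be used to delineate where a tropical relation is assigned. The coefficients are textbook values and the 0.5 threshold is an assumption; they are not necessarily those used in this study.

        # Illustrative only: invert Z = a * R^b for a convective and a tropical
        # relation, then assign the tropical rate where the warm-rain probability
        # exceeds a threshold, mimicking the delineation idea described above.
        import numpy as np

        def rain_rate(dbz, a, b):
            """dbz in dBZ; returns rain rate in mm/h from Z = a * R^b."""
            z = 10.0 ** (dbz / 10.0)          # reflectivity factor (mm^6 m^-3)
            return (z / a) ** (1.0 / b)

        dbz = np.array([35.0, 45.0, 50.0])
        r_conv = rain_rate(dbz, a=300.0, b=1.4)   # common convective relation
        r_trop = rain_rate(dbz, a=250.0, b=1.2)   # common tropical relation
        p_warm = np.array([0.2, 0.6, 0.9])        # probability of warm-rain enhancement
        r_final = np.where(p_warm >= 0.5, r_trop, r_conv)
        print(r_conv, r_trop, r_final)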

    Progress in operational modeling in support of oil spill response

    Following the 2010 Deepwater Horizon accident, a massive blowout in the Gulf of Mexico, scientists from government, industry, and academia collaborated to advance oil spill modeling and share best practices in model algorithms, parameterizations, and application protocols. This synergy was greatly enhanced by research funded under the Gulf of Mexico Research Initiative (GoMRI), a 10-year enterprise that allowed unprecedented collection of observations and data products, novel experiments, and international collaborations that focused on the Gulf of Mexico but resulted in scientific findings and tools of broader value. Operational oil spill modeling greatly benefited from research during the GoMRI decade. This paper provides a comprehensive synthesis of the related scientific advances, remaining challenges, and future outlook. Two main modeling components are discussed, ocean circulation models and oil spill models, with details on the attributes that contribute to the success and the limitations of integrated oil spill forecasts. These forecasts are discussed in tandem with uncertainty factors and methods to mitigate them. The paper focuses on operational aspects of oil spill modeling and forecasting, including examples of international operational center practices, observational needs, communication protocols, and promising new methodologies.
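    As a toy illustration of the trajectory component of such operational systems, the sketch below advects oil "spillets" with an ocean-current field, a conventional wind-drift (windage) factor, and a random-walk term for unresolved diffusion. The velocity fields, the 3% windage, and the diffusivity are placeholder assumptions, not values taken from any operational model described in the paper.

        # Toy Lagrangian spillet advection: drift by currents plus windage,
        # plus a random walk standing in for turbulent diffusion.
        import numpy as np

        rng = np.random.default_rng(0)

        def step(positions, current, wind, dt, windage=0.03, diffusivity=10.0):
            """positions: (n, 2) in metres; current/wind: (n, 2) velocities in m/s."""
            drift = (current + windage * wind) * dt
            diffusion = np.sqrt(2.0 * diffusivity * dt) * rng.standard_normal(positions.shape)
            return positions + drift + diffusion

        pos = np.zeros((1000, 2))                      # spillets start at the release point
        for _ in range(24):                            # 24 hourly steps
            current = np.tile([0.2, 0.05], (1000, 1))  # placeholder uniform current (m/s)
            wind = np.tile([5.0, -2.0], (1000, 1))     # placeholder 10-m wind (m/s)
            pos = step(pos, current, wind, dt=3600.0)
        print(pos.mean(axis=0), pos.std(axis=0))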

    Confronting the Challenge of Modeling Cloud and Precipitation Microphysics

    In the atmosphere, microphysics refers to the microscale processes that affect cloud and precipitation particles and is a key linkage among the various components of Earth's atmospheric water and energy cycles. The representation of microphysical processes in models continues to pose a major challenge leading to uncertainty in numerical weather forecasts and climate simulations. In this paper, the problem of treating microphysics in models is divided into two parts: (i) how to represent the population of cloud and precipitation particles, given the impossibility of simulating all particles individually within a cloud, and (ii) uncertainties in the microphysical process rates owing to fundamental gaps in knowledge of cloud physics. The recently developed Lagrangian particle‐based method is advocated as a way to address several conceptual and practical challenges of representing particle populations using traditional bulk and bin microphysics parameterization schemes. For addressing critical gaps in cloud physics knowledge, sustained investment for observational advances from laboratory experiments, new probe development, and next‐generation instruments in space is needed. Greater emphasis on laboratory work, which has apparently declined over the past several decades relative to other areas of cloud physics research, is argued to be an essential ingredient for improving process‐level understanding. More systematic use of natural cloud and precipitation observations to constrain microphysics schemes is also advocated. Because it is generally difficult to quantify individual microphysical process rates from these observations directly, this presents an inverse problem that can be viewed from the standpoint of Bayesian statistics. Following this idea, a probabilistic framework is proposed that combines elements from statistical and physical modeling. Besides providing rigorous constraint of schemes, there is an added benefit of quantifying uncertainty systematically. Finally, a broader hierarchical approach is proposed to accelerate improvements in microphysics schemes, leveraging the advances described in this paper related to process modeling (using Lagrangian particle‐based schemes), laboratory experimentation, cloud and precipitation observations, and statistical methods.
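    A highly simplified sketch of the Lagrangian particle-based ("super-droplet") idea follows: each computational particle carries a radius and a multiplicity (the number of real droplets it represents), a process rate (here only diffusional growth at a fixed supersaturation) is applied particle by particle, and bulk quantities are diagnosed from multiplicity-weighted sums. All numerical values are illustrative assumptions, not a production scheme.

        # Toy super-droplet condensational growth: dr/dt = G * S / r,
        # stepped with explicit Euler for each computational particle.
        import numpy as np

        rng = np.random.default_rng(1)
        radius = rng.lognormal(mean=np.log(10e-6), sigma=0.4, size=1000)  # metres
        multiplicity = np.full(1000, 1e8)   # real droplets represented per particle

        G = 1e-10      # growth coefficient (m^2/s), illustrative
        S = 0.005      # fixed supersaturation (0.5 %)
        dt = 1.0       # time step (s)

        for _ in range(60):
            radius += dt * G * S / radius

        # bulk liquid water is diagnosed by summing over particles, weighted by multiplicity
        lwc_proxy = (4.0 / 3.0) * np.pi * 1000.0 * np.sum(multiplicity * radius**3)
        print("mean radius (um):", 1e6 * radius.mean(), "LWC proxy:", lwc_proxy)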

    Remote Sensing of Natural Hazards

    Each year, natural hazards such as earthquakes, cyclones, flooding, landslides, wildfires, avalanches, volcanic eruptions, extreme temperatures, storm surges, and drought result in widespread loss of life, livelihood, and critical infrastructure globally. With the unprecedented growth of the human population, large-scale development activities, and changes to the natural environment, the frequency and intensity of extreme natural events and their consequent impacts are expected to increase in the future. Technological interventions provide essential provisions for the prevention and mitigation of natural hazards. The data obtained through remote sensing systems with varied spatial, spectral, and temporal resolutions particularly provide prospects for furthering knowledge on the spatiotemporal patterns and forecasting of natural hazards. The collection of data using earth observation systems has been valuable for alleviating the adverse effects of natural hazards, especially with their near real-time capabilities for tracking extreme natural events. Remote sensing systems from different platforms also serve as an important decision-support tool for devising response strategies, coordinating rescue operations, and making damage and loss estimations. With these in mind, this book seeks original contributions on the advanced applications of remote sensing and geographic information systems (GIS) techniques in understanding various dimensions of natural hazards through new theory, data products, and robust approaches.

    Sustainable Reservoir Management Approaches under Impacts of Climate Change - A Case Study of Mangla Reservoir, Pakistan

    Reservoir sedimentation is a major issue for water resource management around the world. It has serious economic, environmental, and social consequences, such as reduced water storage capacity, increased flooding risk, decreased hydropower generation, and deteriorated water quality. Increased rainfall intensity, higher temperatures, and more extreme weather events due to climate change are expected to exacerbate the problem of reservoir sedimentation. As a result, sedimentation must be managed to ensure the long-term viability of reservoirs and their associated infrastructure. Effective reservoir sedimentation management in the face of climate change necessitates an understanding of the sedimentation process and the factors that influence it, such as land use practices, erosion, and climate. Monitoring and modelling sedimentation rates are also useful tools for forecasting future impacts and making management decisions. The goal of this research is to create long-term reservoir management strategies in the face of climate change by simulating the effects of various reservoir-operating strategies on reservoir sedimentation and sediment delta movement at Mangla Reservoir in Pakistan (the second-largest dam in the country). To assess the impact of sedimentation on the Mangla Reservoir and its useful life, a framework was developed that incorporates hydrological and morphodynamic models as well as various soft computing models. In addition to taking climate change uncertainty into consideration, the proposed framework also accounts for sediment sources, sediment delivery, and changes in reservoir morphology. Furthermore, this study aims to provide a practical methodology based on the limited data available. The first phase of this study investigated how to accurately quantify missing suspended sediment load (SSL) data in rivers by utilizing various techniques, such as sediment rating curves (SRC) and soft computing models (SCMs), including local linear regression (LLR), artificial neural networks (ANN), and wavelet-cum-ANN (WANN). Further, the Gamma test and M-test were performed to select the best input variables and an appropriate data length for SCM development. Based on an evaluation of the outcomes of all leading models for SSL estimation, it can be concluded that SCMs are more effective than SRC approaches. The results also indicated that the WANN model was the most accurate for reconstructing the SSL time series because it is capable of identifying the salient characteristics of a data series. The second phase of this study examined the feasibility of using four satellite precipitation datasets (SPDs), namely GPM, PERSIANN-CDR, CHIRPS, and CMORPH, to predict streamflow and sediment loads (SL) within a poorly gauged mountainous catchment, employing the SWAT hydrological model as well as SWAT coupled with soft computing models (SCMs) such as artificial neural networks (SWAT-ANN), random forests (SWAT-RF), and support vector regression (SWAT-SVR). The SCMs were developed using the outputs of un-calibrated SWAT hydrological models to improve the predictions. The results indicate that over the entire simulation, GPM shows the best performance in both schemes, while PERSIANN-CDR and CHIRPS also perform well, whereas CMORPH predicts streamflow for the Upper Jhelum River Basin (UJRB) with relatively poor performance.
Among the best GPM-based models, SWAT-RF offered the best performance for simulating the overall streamflow, while SWAT-ANN excelled at simulating the SL. Hence, hydrological models coupled with SCMs and driven by SPDs can be an effective technique for simulating streamflow and SL, particularly in complex terrain where the gauge network density is low or uneven. The third and last phase of this study investigated the impact of different reservoir operating strategies on Mangla Reservoir sedimentation using a 1D sediment transport model. To improve the accuracy of the model, more accurate boundary conditions for flow and sediment load (derived from the first and second phases of this study) were incorporated into the numerical model so that the morphodynamic model could precisely predict bed level changes under given climate conditions. Further, to assess the long-term effect of a changing climate, a Global Climate Model (GCM) under Representative Concentration Pathway (RCP) scenarios 4.5 and 8.5 for the 21st century was used. The long-term modelling results showed that a gradual increase in the reservoir minimum operating level (MOL) slows down the delta movement rate and the rise of the bed level close to the dam. However, it may compromise the downstream irrigation supply during periods of high water demand. The findings may help reservoir managers improve reservoir operation rules and ultimately support the objective of sustainable reservoir use for societal benefit. In summary, this study provides comprehensive insights into reservoir sedimentation phenomena and recommends an operational strategy that is both feasible and sustainable over the long term under the impact of climate change, especially where data are scarce. Improving the accuracy of sediment load estimates is essential for the design and operation of reservoir structures and for operating plans that respond to incoming sediment loads, ensuring accurate reservoir lifespan predictions. Furthermore, producing highly accurate streamflow forecasts, particularly when on-site data are limited, is important and can be achieved by using satellite-based precipitation data in conjunction with hydrological and soft computing models. Ultimately, the use of soft computing methods produces significantly improved input data for sediment load and discharge, enabling the application of one-dimensional hydro-morphodynamic numerical models to evaluate sediment dynamics and reservoir useful life under the influence of climate change at various operating conditions.
    Chapter 1: Introduction
    Chapter 2: Reconstruction of Sediment Load Data in Rivers
    Chapter 3: Assessment of the Hydrological and Coupled Soft Computing Models, Based on Different Satellite Precipitation Datasets, to Simulate Streamflow and Sediment Load in a Mountainous Catchment
    Chapter 4: Simulating the Impact of Climate Change with Different Reservoir Operating Strategies on Sedimentation of the Mangla Reservoir, Northern Pakistan
    Chapter 5: Conclusions and Recommendation
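    As a small illustration of the sediment rating curve (SRC) baseline mentioned in the first phase, the sketch below fits SSL = a·Q^b by least squares in log-log space and uses the fitted curve to fill gaps in the suspended sediment load record. The discharge and load values are synthetic placeholders, not Mangla Reservoir observations.

        # Fit a sediment rating curve SSL = a * Q^b in log-log space and use it
        # to reconstruct missing suspended sediment load values.
        import numpy as np

        q_obs = np.array([120.0, 300.0, 850.0, 1500.0, 2400.0])   # discharge (m3/s)
        ssl_obs = np.array([0.8, 3.5, 22.0, 65.0, 140.0])         # SSL (kt/day)

        b, log_a = np.polyfit(np.log(q_obs), np.log(ssl_obs), deg=1)
        a = np.exp(log_a)
        print(f"SRC: SSL = {a:.4g} * Q^{b:.2f}")

        q_missing = np.array([500.0, 1900.0])   # days with discharge but no SSL sample
        ssl_filled = a * q_missing**b           # reconstructed loads
        print(ssl_filled)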

    COBE's search for structure in the Big Bang

    The launch of the Cosmic Background Explorer (COBE) and the definition of the Earth Observing System (EOS) are two of the major events at NASA-Goddard. The three experiments carried by COBE (the Differential Microwave Radiometer (DMR), the Far Infrared Absolute Spectrophotometer (FIRAS), and the Diffuse Infrared Background Experiment (DIRBE)) are very important in measuring the big bang. DMR measures the isotropy of the cosmic background (the direction of the radiation), FIRAS examines the spectrum over the whole sky in search of deviations, and DIRBE operates in the infrared part of the spectrum, gathering evidence of the earliest galaxy formation. Using special techniques, the radiation coming from the solar system will be distinguished from that of extragalactic origin. Unique graphics will be used to represent the temperature of the emitting material. A cosmic event will be modeled that is of such importance that it will affect cosmological theory for generations to come. EOS will monitor changes in the Earth's geophysics during a whole solar cycle.

    Development of numerical methodologies to predict the liquid fuel spray-wall interaction to optimize the mixing process of direct injection spark ignition engines

    Nowadays, the development of new internal combustion engines is mainly driven by the need to reduce tailpipe emissions of pollutants and greenhouse gases and to avoid wasting fossil fuels. The design of the combustion chamber's dimensions and shape, together with the implementation of different injection strategies (e.g., injection timing, spray targeting, and higher injection pressures), plays a key role in accomplishing these targets. As far as the match between fuel injection and evaporation and the combustion chamber shape is concerned, assessing the interaction between the liquid fuel spray and the engine walls in gasoline direct injection engines is crucial. Numerical simulation is an acknowledged technique for supporting the study of new technological solutions, such as the design of new gasoline blends and of tailored injection strategies to achieve the target mixture formation. The current simulation framework lacks a well-defined best practice for simulating the liquid fuel spray interaction, which is a complex multi-physics problem. This thesis deals with the development of robust methodologies for the numerical simulation of the interaction of the liquid fuel spray with walls and lubricants. The work was divided into three tasks: i) setup and validation of three-dimensional CFD spray-wall impingement simulations; ii) development of a one-dimensional model describing the liquid fuel-lubricant oil interaction; iii) development of a machine learning-based algorithm to define which mixture of known pure components mimics the physical behaviour of real gasoline for the simulation of the liquid fuel spray interaction.
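    As a hedged sketch of the idea behind the third task, the snippet below selects mass fractions of known pure components so that a linearly blended property vector matches a target gasoline. The constrained least-squares formulation, the component property matrix, and the target values are illustrative assumptions rather than the thesis algorithm.

        # Choose blend fractions x (summing to 1, each in [0, 1]) so that the
        # linearly blended properties approach a hypothetical target gasoline.
        import numpy as np
        from scipy.optimize import minimize

        # rows: components (e.g. iso-octane, n-heptane, toluene, ethanol)
        # cols: properties (density kg/m3, vapour pressure kPa, LHV MJ/kg)
        props = np.array([
            [692.0, 35.0, 44.3],
            [684.0, 48.0, 44.6],
            [867.0, 10.0, 40.6],
            [789.0, 16.0, 26.8],
        ])
        target = np.array([745.0, 30.0, 42.5])   # hypothetical target gasoline

        def objective(x):
            blend = x @ props                     # simple linear blending assumption
            return np.sum(((blend - target) / target) ** 2)

        x0 = np.full(4, 0.25)
        res = minimize(objective, x0,
                       bounds=[(0.0, 1.0)] * 4,
                       constraints=[{"type": "eq", "fun": lambda x: x.sum() - 1.0}])
        print(res.x, res.x @ props)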

    Machine Learning and Its Application to Reacting Flows

    This open access book introduces and explains machine learning (ML) algorithms and techniques developed for statistical inference on complex processes or systems and their applications to simulations of chemically reacting turbulent flows. These two fields, ML and turbulent combustion, each have a large body of work and knowledge of their own, and this book brings them together and explains the complexities and challenges involved in applying ML techniques to simulate and study reacting flows. This matters for the world's total primary energy supply (TPES), since more than 90% of that supply comes from combustion technologies, and combustion has non-negligible effects on the environment. Although alternative technologies based on renewable energies are emerging, their share of the TPES is currently less than 5%, and a complete paradigm shift would be needed to replace combustion sources. Whether this is practical or not is entirely a different question, and the answer depends on the respondent. However, a pragmatic analysis suggests that the combustion share of the TPES is likely to remain above 70% even by 2070. Hence, it is prudent to take advantage of ML techniques to improve combustion science and technology so that efficient and “greener” combustion systems that are friendlier to the environment can be designed. The book covers the current state of the art in these two topics and outlines the challenges involved, as well as the merits and drawbacks of using ML for turbulent combustion simulations, including avenues that can be explored to overcome the challenges. The required mathematical equations and background are discussed, with ample references for readers who wish to find further detail. This book is unique, since no other book offers similar coverage of topics ranging from big data analysis and machine learning algorithms to their applications in combustion science and system design for energy generation.