
    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The contributions collected in this volume have either been published or presented in international conferences, seminars, workshops and journals after the dissemination of the fourth volume in 2015, or they are new. The contributions in each part of this volume are ordered chronologically. The first part of the book presents theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of the (quasi-)vacuous belief assignment in the fusion of sources of evidence, together with their Matlab codes. Because more applications of DSmT have emerged since the appearance of the fourth book in 2015, the second part of this volume is devoted to selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, the negator of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
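    To make the PCR-type rules listed above concrete, here is a minimal Python sketch of the classical PCR5 combination of two basic belief assignments; the toy frame, mass values and dictionary representation are assumptions for illustration only, and the volume's own reference codes are in Matlab and in the Silx-Furtif RUST library.

```python
from itertools import product

def pcr5(m1, m2):
    """PCR5 combination of two basic belief assignments, given as dicts
    mapping frozenset focal elements to masses: conjunctive consensus on
    non-conflicting pairs, plus proportional redistribution of each partial
    conflict back to the two focal elements that generated it."""
    combined = {}
    for (x, mx), (y, my) in product(m1.items(), m2.items()):
        prod = mx * my
        inter = x & y
        if inter:  # non-empty intersection: conjunctive consensus
            combined[inter] = combined.get(inter, 0.0) + prod
        else:      # partial conflict m1(X)*m2(Y), split proportionally to mx and my
            combined[x] = combined.get(x, 0.0) + mx * prod / (mx + my)
            combined[y] = combined.get(y, 0.0) + my * prod / (mx + my)
    return combined

# Toy example on the frame {A, B}: the total conflict 0.54 is redistributed
# back to A and B, and the combined masses still sum to one.
A, B = frozenset("A"), frozenset("B")
print(pcr5({A: 0.6, B: 0.4}, {A: 0.3, B: 0.7}))
```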

    Northeastern Illinois University, Academic Catalog 2023-2024


    Hydrogen storage in depleted gas reservoirs: A comprehensive review

    The hydrogen future depends on large-scale storage, which can be provided by geological formations (such as caverns, aquifers, and depleted oil and gas reservoirs) to handle demand and supply changes, a typical hysteresis of most renewable energy sources. Amongst them, depleted natural gas reservoirs are the most cost-effective and secure solution owing to their wide geographic distribution, proven surface facilities, and less ambiguous site evaluation. They also require less cushion gas, as the native residual gas serves as a buffer for pressure maintenance during storage. However, a thorough understanding of this technology is still lacking. This work aims to provide a comprehensive insight and technical outlook into hydrogen storage in depleted gas reservoirs. It briefly discusses operating and potential facilities, case studies, and the thermophysical and petrophysical properties governing storage and withdrawal capacity, gas immobilization, and efficient gas containment. Furthermore, hydrogen, methane, and carbon dioxide are compared with respect to well integrity during gas storage. A summary of the key findings, challenges, and prospects is also reported. Based on the review, hydrodynamic, geochemical, and microbial factors are the principal subsurface promoters of hydrogen losses. The injection strategy, reservoir features, quality, and operational parameters significantly impact gas storage in depleted reservoirs. Future work (experimental and simulation) is recommended to focus on the hydrodynamic and geomechanical aspects related to migration, mixing, and dispersion for improved recovery. Overall, this review provides a streamlined insight into hydrogen storage in depleted gas reservoirs.
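    As a rough illustration of how working-gas capacity can be screened for a depleted reservoir, the short Python sketch below applies the real-gas law between a maximum storage pressure and a cushion-gas pressure; every reservoir figure and hydrogen compressibility factor here is a hypothetical placeholder rather than a value from the review.

```python
R = 8.314        # universal gas constant, J/(mol*K)
M_H2 = 2.016e-3  # molar mass of hydrogen, kg/mol

def h2_mass_kg(pressure_pa, temp_k, pore_volume_m3, z_factor):
    """Hydrogen mass filling a pore volume at reservoir conditions,
    from the real-gas law P*V = Z*n*R*T."""
    n_mol = pressure_pa * pore_volume_m3 / (z_factor * R * temp_k)
    return n_mol * M_H2

# Hypothetical depleted-reservoir screening figures (illustrative only)
V_pore = 5.0e6              # m^3 of gas-filled pore volume available for storage
T_res = 350.0               # reservoir temperature, K
P_max, Z_max = 20e6, 1.12   # maximum storage pressure (Pa) and assumed H2 Z-factor
P_min, Z_min = 8e6, 1.05    # cushion-gas pressure (Pa) and assumed H2 Z-factor

working_gas = (h2_mass_kg(P_max, T_res, V_pore, Z_max)
               - h2_mass_kg(P_min, T_res, V_pore, Z_min))
print(f"Working hydrogen mass: {working_gas / 1e6:.1f} kt")  # kg -> kilotonnes
```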

    Endogenous measures for contextualising large-scale social phenomena: a corpus-based method for mediated public discourse

    This work presents an interdisciplinary methodology for developing endogenous measures of group membership through analysis of pervasive linguistic patterns in public discourse. Focusing on political discourse, it critiques the conventional approach to the study of political participation, which is premised on decontextualised, exogenous measures to characterise groups. Considering the theoretical and empirical weaknesses of decontextualised approaches to large-scale social phenomena, this work suggests that contextualisation using endogenous measures might provide a complementary perspective to mitigate such weaknesses. It develops a sociomaterial perspective on political participation in mediated discourse as affiliatory action performed through language. While the affiliatory function of language is often performed consciously (such as in statements of identity), this work is concerned with unconscious features (such as patterns in lexis and grammar). It argues that pervasive patterns in such features, which emerge through socialisation, are resistant to change and manipulation, and thus might serve as endogenous measures of sociopolitical contexts, and hence of groups. In terms of method, the work takes a corpus-based approach to the analysis of data from the Twitter messaging service, whereby patterns in users' speech are examined statistically in order to trace potential community membership. The method is applied in the US state of Michigan during the second half of 2018, 6 November being the date of the midterm (i.e. non-Presidential) elections in the United States. The corpus is assembled from the original posts of 5,889 users, who are nominally geolocalised to 417 municipalities, and these users are clustered according to pervasive language features. Comparing the linguistic clusters according to the municipalities they represent reveals regular sociodemographic differentials across clusters. This is understood as an indication of social structure, suggesting that endogenous measures derived from pervasive patterns in language may indeed offer a complementary, contextualised perspective on large-scale social phenomena.
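    One plausible, heavily simplified realisation of the clustering step described above is sketched below in Python: each user is represented by the relative weight of pervasive, low-content features in their posts, and users are then grouped. The feature list, vectoriser, cluster count and toy data are assumptions for illustration and do not reproduce the thesis's actual corpus pipeline.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy data: user_id -> concatenated original posts
users = {
    "u1": "well i think that we should look at it again",
    "u2": "gonna be at the game tonight it is the best",
    "u3": "it is clear that the policy should be revised",
}

# Stand-in for "patterns in lexis and grammar": a small set of function words
function_words = ["the", "that", "we", "i", "it", "is", "be", "at", "should", "gonna"]

# Vectorise each user over the pervasive features only, then cluster
vectoriser = TfidfVectorizer(vocabulary=function_words)
X = vectoriser.fit_transform(users.values())
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for user_id, label in zip(users, labels):
    print(user_id, "-> cluster", label)
```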

    Coupled point neutron kinetics and thermal-hydraulics models of transient nuclear criticality excursions in wetted fissile uranium dioxide (UO2) powders

    This thesis describes a phenomenologically based mathematical and computational methodology for the simulation of a postulated transient nuclear criticality excursion initiated by the incursion of water, from a fire-sprinkler system, into a bed of dry UO2 powder. These potentially hazardous multi-phase dispersed particulate systems may form as a result of a fire or explosion in a nuclear fuel fabrication facility. The models proposed in this thesis aim to support nuclear criticality safety analysis and assessment, as well as emergency planning and preparedness. The point neutron kinetics equations are coupled to phenomenological models of water infiltration, sedimentation, fluidisation, nuclear thermal hydraulics, radiolysis and boiling, through the use of multivariate reactivity feedback components. The spatial and temporal solution of this set of equations enables the modelling of postulated transient nuclear criticality excursions in highly dispersed three-phase particulate systems of UO2 powder. The results indicate that large positive reactivities can be added to a UO2 powder system as pores become filled with water. Generally, thermal expansion and Doppler broadening are insufficient to control the transient, leading to significant radiolysis and boiling on the surface of the UO2 powder particles. Fluidisation and sedimentation induced by radiolytic gas and steam bubbles significantly alter the characteristics of a transient nuclear criticality excursion and should be carefully considered. Research has also been undertaken examining transient nuclear criticality excursions in UO2 powder systems with weak intrinsic neutron sources, by solving the forward probability balance equation and using a Gamma probability distribution function to estimate mean wait-time probability distributions. Significant variations in the potential initial peak power are predicted for highly enriched, wetted UO2 powders as a function of the stochastic behaviour associated with criticality excursions in low neutron population systems.
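    The coupling described above can be illustrated with a minimal point-kinetics sketch in Python: one effective delayed-neutron group and a crude linear temperature feedback on reactivity. The parameter values and the feedback form are illustrative assumptions and are far simpler than the thesis's multivariate feedback components.

```python
from scipy.integrate import solve_ivp

beta, Lambda, lam = 0.0065, 1.0e-4, 0.08  # delayed fraction, generation time (s), precursor decay (1/s)
alpha_T = -2.0e-5                         # reactivity temperature coefficient (1/K)
heat_cap = 5.0e4                          # lumped heat capacity (J/K)
rho_ext = 0.0070                          # inserted reactivity (slightly above prompt critical)

def rhs(t, y):
    n, c, temp = y                        # power (W), delayed precursors, temperature rise (K)
    rho = rho_ext + alpha_T * temp        # net reactivity including feedback
    dn = (rho - beta) / Lambda * n + lam * c
    dc = beta / Lambda * n - lam * c
    dT = n / heat_cap                     # crude lumped energy deposition
    return [dn, dc, dT]

y0 = [1.0, beta / (Lambda * lam), 0.0]    # precursors in equilibrium with n = 1
sol = solve_ivp(rhs, (0.0, 5.0), y0, method="LSODA", max_step=1e-3)
print("peak power (arbitrary units):", sol.y[0].max())
```

    In this toy excursion the negative temperature coefficient alone terminates the prompt transient; representing the water-infiltration, radiolysis and boiling feedbacks of the thesis would require replacing the single temperature equation with the corresponding multivariate models.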

    Data Rescue : defining a comprehensive workflow that includes the roles and responsibilities of the research library.

    Thesis (PhD (Research))--University of Pretoria, 2023. This study, comprising a case study at a selected South African research institute, focused on the creation of a workflow model for data rescue indicating the roles and responsibilities of the research library. Additional outcomes of the study include a series of recommendations addressing the troublesome findings, which revealed that data at risk is a prevalent reality at the selected institute, that a multitude of factors put data at risk, that researchers face a profusion of data rescue obstacles, and that data rescue at the institute is rarely implemented. The study consists of four main parts: (i) a literature review, (ii) content analysis of literature resulting in the creation of a data rescue workflow model, (iii) empirical data collection, and (iv) the adaptation and revision of the initial data rescue model to present a recommended version of the model. The literature review addressed data at risk and data rescue terminology, factors putting data at risk, the nature, diversity and prevalence of data rescue projects, and the rationale for data rescue. The second part of the study entailed the application of content analysis to selected documented data rescue workflows, guidelines and models. Findings of the analysis led to the identification of crucial components of data rescue and brought about the creation of an initial Data Rescue Workflow Model. As a first draft of the model, it was crucial that the model be reviewed by institutional research experts during the next main stage of the study. The methodology section culminates in the implementation of four different empirical data collection methods: data collected via a web-based questionnaire distributed to a sample of research group leaders (RGLs), one-on-one virtual interviews with a sample of the aforementioned RGLs, feedback supplied by RGLs after reviewing the initial Data Rescue Workflow Model, and a focus group session held with institutional research library experts. Together these produced insight into the institute's data at risk and the state of data rescue. Feedback supplied by RGLs after examining the initial Data Rescue Workflow Model produced a list of concerns linked to the model and contained suggestions for changes to it. RGL feedback was at times unrelated to the model or to data, which necessitated a mini focus group session involving institutional research library experts; this session comprised discussions around requirements for a data rescue workflow model. The consolidation of RGL feedback and feedback supplied by research library experts enabled the creation of a recommended Data Rescue Workflow Model, with the model also indicating the various roles and responsibilities of the research library. The contribution of this research lies primarily in the increase in theoretical knowledge regarding data at risk and data rescue, and culminates in the presentation of a recommended Data Rescue Workflow Model. The model not only portrays crucial data rescue activities and outputs, but also indicates the roles and responsibilities of a sector that can enhance and influence the prevalence and execution of data rescue projects.
    In addition, participation in data rescue and an understanding of the activities and steps portrayed via the model can contribute towards an increase in the skills base of the library and information services sector and enhance collaboration projects with relevant research sectors. It is also anticipated that the study recommendations and exposure to the model may influence the viewing and handling of data by researchers and the accompanying research procedures.

    Machine Learning and Its Application to Reacting Flows

    This open access book introduces and explains machine learning (ML) algorithms and techniques developed for statistical inference on complex processes or systems, and their application to simulations of chemically reacting turbulent flows. These two fields, ML and turbulent combustion, each have a large body of work and knowledge of their own, and this book brings them together and explains the complexities and challenges involved in applying ML techniques to simulate and study reacting flows. This matters for the world's total primary energy supply (TPES), since more than 90% of this supply comes from combustion technologies, whose effects on the environment are non-negligible. Although alternative technologies based on renewable energies are emerging, their share of the TPES is currently less than 5%, and a complete paradigm shift would be needed to replace combustion sources. Whether this is practical or not is an entirely different question, and the answer depends on the respondent. However, a pragmatic analysis suggests that the combustion share of TPES is likely to remain above 70% even by 2070. Hence, it is prudent to take advantage of ML techniques to improve combustion science and technology so that efficient and "greener" combustion systems that are friendlier to the environment can be designed. The book covers the current state of the art in these two topics and outlines the challenges involved, along with the merits and drawbacks of using ML for turbulent combustion simulations, including avenues that can be explored to overcome the challenges. The required mathematical equations and background are discussed, with ample references for readers who wish to find further detail. This book is unique in that no other book offers similar coverage of topics, ranging from big data analysis and machine learning algorithms to their applications in combustion science and system design for energy generation.

    Forecasting CO2 Sequestration with Enhanced Oil Recovery

    The aim of carbon capture, utilization, and storage (CCUS) is to reduce the amount of CO2 released into the atmosphere and to mitigate its effects on climate change. Over the years, naturally occurring CO2 sources have been utilized in enhanced oil recovery (EOR) projects in the United States. This has presented an opportunity to supplement and gradually replace the high demand for natural CO2 sources with anthropogenic sources. Incentives also exist for operators to become involved in the storage of anthropogenic CO2 within partially depleted reservoirs, in addition to the incremental oil production revenues. These incentives include a wider availability of anthropogenic sources, the reduction of emissions to meet regulatory requirements, tax incentives in some jurisdictions, and favorable public relations. The United States Department of Energy has sponsored several Regional Carbon Sequestration Partnerships (RCSPs) through its Carbon Storage program, which have conducted field demonstrations for both EOR and saline aquifer storage. Various research efforts have been made in the areas of reservoir characterization; monitoring, verification, and accounting; simulation; and risk assessment to ascertain long-term storage potential within the subject storage complex. This book is a collection of lessons learned through the RCSP program within the Southwest Region of the United States. Its scope includes site characterization, storage modeling, monitoring, reporting, and verification (MRV), risk assessment, and international case studies.

    Modelling for the automatic assessment of rainfall triggered landslide susceptibility due to changes in groundwater level and soil water content.

    Risk assessment of rain-triggered landslides over large areas is quite challenging due to the complexity of the phenomenon. Rainfall is one of the most important triggering factors for landslides: it performs an erosive action at ground level and, through deep infiltration, increases the soil saturation degree and feeds the groundwater table, leading to fluctuations that can affect slope stability. These phenomena represent an open challenge for technicians and authorities involved in landslide risk management and mitigation. For this reason, it is necessary to develop landslide susceptibility assessment models that are operationally compatible with good resolution and computational speed; standard methods of 3D slope stability analysis are generally applied only over limited areas or at low resolution. In this dissertation, two automatic procedures are proposed for estimating landslide susceptibility induced by changes in (i) groundwater levels and (ii) soil saturation conditions. A physically based Integrated Hydrological and Geotechnical (IHG) model was implemented in a GIS environment to effectively analyse areas of a few square kilometres, typically at a scale of 1:5,000. For each volume element into which the studied mass is discretised, a simplified hydrological soil-water balance and geotechnical model are applied to assess debris- and earth-slide susceptibility for measured or forecast rainfall. The IHG procedure allows 3D modelling of landslide areas, both morphologically and with regard to geotechnical/hydrological parameters, thanks to the spatialisation of input data from in situ measurements, and it renders easy-to-understand results. Critical issues inherent in the discretisation of fairly large areas, concerning soil characterisation, interpolation/extrapolation of in situ measurements, spatial resolution, and computational effort, are discussed. For rain-triggered shallow landslides, stability can be markedly influenced by the propagation of the saturation front within the unsaturated zone. Soil shear strength varies in the vadose zone depending on the type of soil and the variations of soil moisture. The unsaturated zone can be monitored by measuring volumetric water content using low-cost instrumentation (i.e. capacitive sensors) that is easy to manage and provides data in near-real time; for proper soil moisture assessment, a laboratory soil-specific calibration of the sensors is recommended. Knowing the soil water content, the suction can be estimated through a Water Retention Curve (WRC), and consequently the soil shear strength in unsaturated conditions can be evaluated. The automatic procedure developed in the GIS environment and described here, named assessment of Soil Apparent Cohesion (SAC), allows the soil shear strength to be estimated from soil moisture monitoring data (from sensor networks or satellite-derived maps). SAC results can be integrated into existing models for landslide susceptibility assessment and also used for emergency management. Some significant results concerning the automatic IHG and SAC procedures, implemented in Python and applied to landslides within the Alcotra AD-VITAM project, are presented.
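    A minimal Python sketch of the SAC-style chain described above follows: measured volumetric water content is converted to matric suction through an assumed van Genuchten water retention curve, a Vanapalli-type apparent cohesion is derived, and an infinite-slope factor of safety is computed. All soil and slope parameters are illustrative assumptions, not the project's calibrated values.

```python
import math

# Assumed (illustrative) soil and slope parameters
theta_r, theta_s = 0.05, 0.45   # residual / saturated volumetric water content
vg_alpha, vg_n = 0.08, 1.6      # van Genuchten alpha (1/kPa) and n
phi_deg, c_eff = 32.0, 2.0      # friction angle (deg), effective cohesion (kPa)
gamma, depth = 18.0, 1.5        # unit weight (kN/m^3), depth of failure surface (m)
beta_deg = 35.0                 # slope angle (deg)

def suction_kpa(theta):
    """Invert the van Genuchten WRC: matric suction from water content."""
    se = (theta - theta_r) / (theta_s - theta_r)      # effective saturation
    m = 1.0 - 1.0 / vg_n
    return (se ** (-1.0 / m) - 1.0) ** (1.0 / vg_n) / vg_alpha

def factor_of_safety(theta):
    """Infinite-slope FS with suction-derived apparent cohesion (no water table)."""
    phi, beta = math.radians(phi_deg), math.radians(beta_deg)
    se = (theta - theta_r) / (theta_s - theta_r)
    c_app = suction_kpa(theta) * se * math.tan(phi)   # Vanapalli-type apparent cohesion
    sigma_n = gamma * depth * math.cos(beta) ** 2     # normal stress on failure plane (kPa)
    tau = gamma * depth * math.sin(beta) * math.cos(beta)  # driving shear stress (kPa)
    return (c_eff + c_app + sigma_n * math.tan(phi)) / tau

for theta in (0.15, 0.25, 0.35, 0.44):
    print(f"theta = {theta:.2f}  FS = {factor_of_safety(theta):.2f}")
```

    With these assumed parameters the factor of safety falls from roughly 2.6 at low water content to about 1.2 near saturation, mirroring the loss of apparent cohesion as suction vanishes.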