    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The contributions collected in this volume have either been published or presented in international conferences, seminars, workshops and journals after the dissemination of the fourth volume in 2015, or they are new. The contributions of each part of this volume are ordered chronologically. The first part of this book presents some theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR based on set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of the (quasi-)vacuous belief assignment in the fusion of sources of evidence, together with their Matlab codes. Because more applications of DSmT have emerged since the publication of the fourth book in 2015, the second part of this volume covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, negators of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
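
    The Proportional Conflict Redistribution rules discussed in the first part combine two basic belief assignments conjunctively and then redistribute each partial conflict back to the elements that generated it, in proportion to their masses. As a rough illustration, here is a minimal Python sketch of the classical two-source PCR5 rule on a small frame of discernment (the frozenset representation of focal elements and the toy mass values are illustrative assumptions, not the Matlab code mentioned above):

    ```python
    from itertools import product

    def pcr5_combine(m1, m2):
        """Two-source PCR5 fusion of basic belief assignments.

        m1, m2: dicts mapping focal elements (frozensets) to masses summing to 1.
        Returns the PCR5-combined mass function.
        """
        fused = {}
        for (x, mx), (y, my) in product(m1.items(), m2.items()):
            inter = x & y
            if inter:
                # Non-conflicting product mass goes to the intersection (conjunctive rule).
                fused[inter] = fused.get(inter, 0.0) + mx * my
            else:
                # Partial conflict: redistribute back to X and Y in proportion
                # to the masses that produced the conflict.
                fused[x] = fused.get(x, 0.0) + mx**2 * my / (mx + my)
                fused[y] = fused.get(y, 0.0) + my**2 * mx / (mx + my)
        return fused

    # Toy example on the frame {A, B}.
    A, B, AB = frozenset("A"), frozenset("B"), frozenset("AB")
    m1 = {A: 0.6, B: 0.3, AB: 0.1}
    m2 = {A: 0.2, B: 0.7, AB: 0.1}
    print(pcr5_combine(m1, m2))  # the fused masses still sum to 1
    ```

    PCR6 extends the same redistribution principle to more than two sources; the improved PCR5 and PCR6 rules mentioned above additionally preserve the neutrality of the (quasi-)vacuous belief assignment.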

    Beam scanning by liquid-crystal biasing in a modified SIW structure

    A fixed-frequency beam-scanning 1D antenna based on Liquid Crystals (LCs) is designed for application to 2D scanning with lateral alignment. The 2D array environment imposes full decoupling of adjacent 1D antennas, which often conflicts with the LC requirement of DC biasing: the proposed design accommodates both. The LC medium is placed inside a Substrate Integrated Waveguide (SIW) modified to work as a Groove Gap Waveguide, with radiating slots etched on the upper broad wall, so that the structure radiates as a Leaky-Wave Antenna (LWA). This allows effective application of the DC bias voltage needed for tuning the LCs. At the same time, the RF field remains laterally confined, making it possible to place several antennas in parallel and achieve 2D beam scanning. The design is validated by simulation employing the actual properties of a commercial LC medium.
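
    The scanning principle can be stated compactly: the DC bias changes the effective permittivity of the LC, which shifts the phase constant of the guided leaky mode and therefore the pointing angle of the beam at a fixed frequency. A minimal Python sketch of this relationship for a generic periodically slotted leaky waveguide radiating through its n = -1 space harmonic (the frequency, effective width, slot period, and permittivity range are illustrative assumptions, not the values of the actual design):

    ```python
    import numpy as np

    c = 299_792_458.0   # speed of light, m/s
    f = 28e9            # fixed operating frequency (illustrative), Hz
    a_eff = 5e-3        # effective guide width (illustrative), m
    p = 9e-3            # slot period (illustrative), m

    def beam_angle_deg(eps_r):
        """Pointing angle (degrees from broadside) of the n = -1 space harmonic.

        eps_r is the effective permittivity of the LC filling, tuned by the DC bias.
        Returns None if the mode is below cutoff or the harmonic does not radiate.
        """
        k0 = 2 * np.pi * f / c
        beta_sq = eps_r * k0**2 - (np.pi / a_eff)**2   # TE10-like dispersion relation
        if beta_sq <= 0:
            return None
        s = (np.sqrt(beta_sq) - 2 * np.pi / p) / k0    # sin(theta) of the n = -1 harmonic
        return float(np.degrees(np.arcsin(s))) if abs(s) < 1 else None

    # Sweeping the LC permittivity scans the beam while the frequency stays fixed.
    for eps in (2.5, 2.8, 3.1, 3.4):
        print(f"eps_r = {eps:.1f} -> theta = {beam_angle_deg(eps)} deg")
    ```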

    Synthetic Aperture Radar (SAR) Meets Deep Learning

    This reprint focuses on the combination of synthetic aperture radar and deep learning technology. It aims to further promote the development of SAR image intelligent interpretation technology. A synthetic aperture radar (SAR) is an important active microwave imaging sensor, whose day-and-night, all-weather imaging capability gives it an important place in the remote sensing community. Since the United States launched the first SAR satellite, SAR has received much attention in the remote sensing community, e.g., in geological exploration, topographic mapping, disaster forecasting, and traffic monitoring. It is therefore valuable and meaningful to study SAR-based remote sensing applications. In recent years, deep learning, represented by convolutional neural networks, has driven significant progress in the computer vision community, e.g., in face recognition, autonomous driving, and the Internet of Things (IoT). Deep learning enables computational models with multiple processing layers to learn data representations with multiple levels of abstraction, which can greatly improve the performance of various applications. This reprint provides a platform for researchers to address these significant challenges and present their innovative and cutting-edge research results when applying deep learning to SAR in various manuscript types, e.g., articles, letters, reviews and technical reports.
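
    As a concrete illustration of the kind of model the reprint is concerned with, here is a minimal convolutional network for single-channel SAR image patches, sketched in Python with PyTorch (the architecture, the 64x64 patch size, and the ten-class output are illustrative assumptions, not a model taken from the reprint):

    ```python
    import torch
    import torch.nn as nn

    class TinySARNet(nn.Module):
        """Minimal CNN classifier for 1-channel SAR amplitude patches."""

        def __init__(self, num_classes: int = 10):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),   # global average pooling over the feature map
            )
            self.classifier = nn.Linear(64, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x).flatten(1))

    # One forward pass on a dummy batch of four 64x64 patches.
    model = TinySARNet()
    logits = model(torch.randn(4, 1, 64, 64))
    print(logits.shape)  # torch.Size([4, 10])
    ```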

    Precision Studies of QCD in the Low Energy Domain of the EIC

    The manuscript focuses on the high-impact science of the EIC, with the objective of identifying a portion of the science program for QCD precision studies that requires, or greatly benefits from, high luminosity and low center-of-mass energies. The science topics include (1) generalized parton distributions, 3D imaging and mechanical properties of the nucleon; (2) mass and spin of the nucleon; (3) momentum dependence of the nucleon in semi-inclusive deep inelastic scattering; (4) exotic meson spectroscopy; (5) science highlights of nuclei; (6) precision studies of lattice QCD in the EIC era; (7) science of far-forward particle detection; (8) radiative effects and corrections; (9) artificial intelligence; and (10) EIC interaction regions for a high-impact science program with discovery potential. This paper documents the scientific basis for supporting such a program and helps to define the path toward the realization of the second EIC interaction region.

    Examining the Relationship Between Lignocellulosic Biomass Structural Constituents and Its Flow Behavior

    Lignocellulosic biomass material obtained from plants and herbaceous sources is a promising source of inexpensive, abundant, and potentially carbon-neutral energy. One of the leading limitations of using lignocellulosic biomass as a feedstock for bioenergy products is the flow issues encountered during biomass conveyance in biorefineries. In the biorefining process, the biomass feedstock flows through a variety of conveyance systems. The inherent variability of the feedstock materials, as evidenced by their complex microstructural composition and non-uniform morphology, coupled with the varying flow conditions in the conveyance systems, gives rise to flow issues such as bridging, ratholing, and clogging. These issues slow down the conveyance process, affect machine life, and can lead to partial or even complete shutdown of the biorefinery. Hence, we need to improve our fundamental understanding of biomass feedstock flow physics and mechanics to address the flow issues and improve biorefinery economics. This dissertation research examines the fundamental relationship between the structural constituents of diverse lignocellulosic biomass materials, i.e., cellulose, hemicellulose, and lignin, their morphology, and the impact of structural composition and morphology on flow behavior. First, we prepared and characterized biomass feedstocks of different chemical compositions and morphologies. Then, we conducted our fundamental investigation experimentally, through physical flow characterization tests, and computationally, through high-fidelity discrete element modeling. Finally, we statistically analyzed the relative influence of the properties of lignocellulosic biomass assemblies on flow behavior to determine the most critical properties and the optimum values of flow parameters. Our research provides an experimental and computational framework to generalize findings to a wider portfolio of biomass materials. It will help the bioenergy community design more efficient biorefining machinery and equipment, reduce the risk of failure, and improve the overall commercial viability of the bioenergy industry.

    The Forward Physics Facility at the High-Luminosity LHC

    High energy collisions at the High-Luminosity Large Hadron Collider (LHC) produce a large number of particles along the beam collision axis, outside of the acceptance of existing LHC experiments. The proposed Forward Physics Facility (FPF), to be located several hundred meters from the ATLAS interaction point and shielded by concrete and rock, will host a suite of experiments to probe standard model (SM) processes and search for physics beyond the standard model (BSM). In this report, we review the status of the civil engineering plans and the experiments to explore the diverse physics signals that can be uniquely probed in the forward region. FPF experiments will be sensitive to a broad range of BSM physics through searches for new particle scattering or decay signatures and deviations from SM expectations in high-statistics analyses with TeV neutrinos in this low-background environment. High-statistics neutrino detection will also provide valuable data for fundamental topics in perturbative and non-perturbative QCD and in weak interactions. Experiments at the FPF will make it possible to exploit synergies between forward particle production at the LHC and astroparticle physics. We report here on these physics topics, on infrastructure, detector, and simulation studies, and on future directions to realize the FPF's physics potential.

    Three essays in macroeconomic forecasting using dimensionality reduction methods

    Get PDF
    This thesis consists of three studies that concentrate on dimensionality reduction methods used in macroeconomic forecasting. Chapter 2 (the first study) investigates the predictive ability of several indicators of consumer sentiment and perceptions about the economy. Based on seven key qualitative questions in the University of Michigan survey of consumers, I employ various quantification approaches to construct six indexes, namely sentiment, disagreement, pessimism, uncertainty, price pressure, and interest rate pressure. I establish that these six indexes convey predictability for key macroeconomic indicators above and beyond the information found in existing, popular macroeconomic and financial indicators. I also provide a deeper interpretation of the consumer indexes by monitoring their response to supply, demand, monetary policy, and financial shocks using a VAR model with sign restrictions. The results indicate that price pressure and interest rate pressure are mainly correlated with financial and uncertainty shocks, while the other indicators reflect the formation of opinions that are sensitive to shocks related to supply, demand, and monetary policy. Chapter 3 (the second study) explores dimensionality reduction by extracting, from a large number of predictors, factors that take into account the correlation with the predicted (target) variable, using a novel time-varying parameter three-pass regression filter algorithm (TVP-3PRF). The benchmark 3PRF algorithm (Kelly and Pruitt, 2015) assumes that a predictor is relevant for forecasting over the whole sample and can be represented using a series of OLS regressions. I extend this approach using time-varying parameter regressions that are conveniently represented as a series of high-dimensional time-invariant regressions which can be solved using penalized likelihood estimators. The TVP-3PRF algorithm allows a subset of variables to be relevant for extracting factors at each point in time, accounting for recent evidence that economic predictors are short-lived. An empirical exercise confirms that this novel feature of the TVP-3PRF algorithm is highly relevant for forecasting macroeconomic time series. Chapter 4 (the third study) asks which of the two main types of dimensionality reduction algorithms truly reflects the way variables enter the model. It is known that, in modelling and forecasting high-dimensional macroeconomic and financial time series, two main methods, sparse modelling and dense modelling, are both popular. However, instead of simply viewing each as a method for avoiding overfitting, a question worth exploring is which of these models can represent the real structure of the data. Another question is whether the uncertainty of variable selection affects the prediction. In line with Giannone et al. (2021), I use their spike-and-slab prior to explore these scenarios for six economies when forecasting production growth. The results indicate that, in all the economies, the way macroeconomic data enter the model has a clearly sparse structure, albeit to different degrees. However, the pervasiveness of uncertainty causes the sparse model to fail and the model averaging technique to become the preferred method. Moreover, surprisingly, the dense model (ridge regression) dominated after the pandemic began.
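
    The benchmark three-pass regression filter mentioned in Chapter 3 can indeed be written as three stacked OLS steps: loadings from time-series regressions of each predictor on the proxies, factors from cross-section regressions of the predictors on those loadings, and a final predictive regression of the target on the factors. A minimal Python sketch under simplifying assumptions (the target itself serves as the single proxy, and the simulated data are purely illustrative; this is not the thesis code):

    ```python
    import numpy as np

    def three_prf_forecast(X, y, Z=None):
        """Three-pass regression filter (Kelly and Pruitt, 2015): a minimal sketch.

        X: (T, N) predictors (ideally standardized), y: (T,) target,
        Z: (T, L) proxies, defaulting to the target itself ("auto-proxy").
        Returns the one-step-ahead forecast of y made at time T.
        """
        T, N = X.shape
        Z = y.reshape(-1, 1) if Z is None else Z

        def ols_slopes(resp, regs):
            # OLS with an intercept; return the slope coefficients only.
            A = np.column_stack([np.ones(len(regs)), regs])
            return np.linalg.lstsq(A, resp, rcond=None)[0][1:]

        # Pass 1: time-series regression of each predictor on the proxies.
        phi = np.vstack([ols_slopes(X[:, i], Z) for i in range(N)])   # (N, L)
        # Pass 2: cross-section regression of the predictors on the loadings.
        F = np.vstack([ols_slopes(X[t, :], phi) for t in range(T)])   # (T, L)
        # Pass 3: predictive regression of y_{t+1} on the factor F_t.
        A = np.column_stack([np.ones(T - 1), F[:-1]])
        beta = np.linalg.lstsq(A, y[1:], rcond=None)[0]
        return beta[0] + F[-1] @ beta[1:]

    # Illustrative use: one persistent latent factor driving 50 noisy predictors.
    rng = np.random.default_rng(0)
    T, N = 200, 50
    f = np.zeros(T)
    for t in range(1, T):
        f[t] = 0.8 * f[t - 1] + rng.standard_normal()
    X = np.outer(f, rng.standard_normal(N)) + 0.5 * rng.standard_normal((T, N))
    y = 0.8 * f + 0.1 * rng.standard_normal(T)
    print(three_prf_forecast(X, y))
    ```

    The thesis's TVP-3PRF replaces the constant coefficients in the first two passes with time-varying ones, so that the set of predictors relevant for extracting the factors can change over the sample.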
