UMSL Bulletin 2023-2024
The 2023-2024 Bulletin and Course Catalog for the University of Missouri St. Louis.
Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5
This fifth volume of Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions from researchers working in mathematics and in different fields of application, and is available in open access. The contributions collected in this volume have either been published or presented in international conferences, seminars, workshops and journals since the fourth volume appeared in 2015, or they are new. The contributions in each part of this volume are ordered chronologically.
The first part of this book presents some theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution Rules (PCR) of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignment in the fusion of sources of evidence, with their Matlab codes.
Because many more applications of DSmT have emerged since the fourth DSmT volume appeared in 2015, the second part of this volume covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and network for ship classification.
Finally, the third part presents interesting contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, negator of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, as well as hybrid techniques mixing deep learning with belief functions.
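To make the PCR5 rule named above concrete, the sketch below combines two basic belief assignments with the conjunctive rule and redistributes each partial conflict proportionally back to the two masses that caused it. The two-element frame and the mass values are a made-up toy example, not taken from the book (whose reference implementations are the Matlab and RUST codes mentioned above).

```python
from itertools import product

def pcr5(m1, m2):
    """Fuse two basic belief assignments (dicts: frozenset -> mass).

    Non-conflicting pairs follow the conjunctive rule; the mass of each
    conflicting pair (A & B empty) is redistributed back to A and B
    proportionally to m1(A) and m2(B), as PCR5 prescribes."""
    out = {}
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:                       # consensus: conjunctive rule
            out[C] = out.get(C, 0.0) + a * b
        elif a + b > 0:             # conflict: proportional redistribution
            out[A] = out.get(A, 0.0) + a * a * b / (a + b)
            out[B] = out.get(B, 0.0) + a * b * b / (a + b)
    return out

# Hypothetical two-element frame {t, f}; mass values invented for illustration.
t, f, tf = frozenset("t"), frozenset("f"), frozenset("tf")
fused = pcr5({t: 0.6, tf: 0.4}, {f: 0.3, tf: 0.7})
# Unlike the unnormalized conjunctive rule, PCR5 keeps the fused masses summing to 1.
```

Note how the vacuous-style mass on the full frame {t, f} tempers the conflict between the two sources, the behavior the improved PCR5/PCR6 rules above are designed to preserve.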
2017 GREAT Day Program
SUNY Geneseo’s Eleventh Annual GREAT Day.
Implicit Loss of Surjectivity and Facial Reduction: Theory and Applications
Facial reduction, pioneered by Borwein and Wolkowicz, is a preprocessing method that is commonly used to obtain strict feasibility in the reformulated, reduced constraint system.
The importance of strict feasibility is often addressed in the context of the convergence results for interior point methods.
Beyond the theoretical properties that facial reduction conveys, we show that facial reduction, which is not limited to interior point methods, leads to strong numerical performance in different classes of algorithms.
In this thesis we study various consequences and the broad applicability of facial reduction.
The thesis is organized in two parts.
In the first part, we examine the instabilities that accompany the absence of strict feasibility through the lens of facially reduced systems.
In particular, we exploit the implicit redundancies, revealed by each nontrivial facial reduction step, resulting in the implicit loss of surjectivity.
This leads to the two-step facial reduction and two novel related notions of singularity.
For the area of semidefinite programming, we use these singularities to strengthen a known bound on the solution rank, the Barvinok-Pataki bound.
For the area of linear programming, we reveal degeneracies caused by the implicit redundancies.
Furthermore, we propose a preprocessing tool that uses the simplex method.
In the second part of this thesis, we continue with the semidefinite programs that do not have strictly feasible points.
We focus on the doubly-nonnegative relaxation of the binary quadratic program and a semidefinite program with a nonlinear objective function.
We closely work with two classes of algorithms, the splitting method and the Gauss-Newton interior point method.
We elaborate on the advantages in building models from facial reduction. Moreover, we develop algorithms for real-world problems including the quadratic assignment problem, the protein side-chain positioning problem, and the key rate computation for quantum key distribution.
Facial reduction continues to play an important role in providing robust reformulated models, both theoretically and practically, resulting in successful numerical performance.
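As background for the reduction steps discussed in this abstract, the standard single facial reduction step for a semidefinite feasibility system can be summarized as follows; this is a sketch of the classical Borwein-Wolkowicz argument, not the thesis's two-step variant.

```latex
Consider the feasible set
\[
  F \;=\; \{\, X \succeq 0 \;:\; \mathcal{A}(X) = b \,\}.
\]
If $F$ is nonempty but has no strictly feasible point, the auxiliary system
\[
  0 \neq Z = \mathcal{A}^{*}(y) \succeq 0, \qquad \langle b, y \rangle = 0
\]
is solvable, and every $X \in F$ satisfies
\[
  \langle Z, X \rangle = \langle y, \mathcal{A}(X) \rangle
                       = \langle y, b \rangle = 0,
\]
which together with $X, Z \succeq 0$ forces $XZ = 0$. Hence, for any matrix
$V$ whose columns span $\operatorname{null}(Z)$,
\[
  F \subseteq \{\, V R V^{\mathsf{T}} \;:\; R \succeq 0 \,\},
\]
and substituting $X = V R V^{\mathsf{T}}$ yields the smaller, facially
reduced system in $R$.
```

Repeating this step until the reduced system is strictly feasible is what exposes the implicit redundancies, and the implicit loss of surjectivity, that the thesis studies.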
Modern meat: the next generation of meat from cells
Modern Meat is the first textbook on cultivated meat, with contributions from over 100 experts within the cultivated meat community.
The Sections of Modern Meat comprise 5 broad categories of cultivated meat: Context, Impact, Science, Society, and World.
The 19 chapters of Modern Meat, spread across these 5 sections, provide detailed entries on cultivated meat. They extensively tour a range of topics including the impact of cultivated meat on humans and animals, the bioprocess of cultivated meat production, how cultivated meat may become a food option in Space and on Mars, and how cultivated meat may impact the economy, culture, and tradition of Asia.
CRAFT: Concept Recursive Activation FacTorization for Explainability
Attribution methods, which employ heatmaps to identify the most influential
regions of an image that impact model decisions, have gained widespread
popularity as a type of explainability method. However, recent research has
exposed the limited practical value of these methods, attributed in part to
their narrow focus on the most prominent regions of an image -- revealing
"where" the model looks, but failing to elucidate "what" the model sees in
those areas. In this work, we try to fill in this gap with CRAFT -- a novel
approach to identify both "what" and "where" by generating concept-based
explanations. We introduce three new ingredients to the automatic concept
extraction literature: (i) a recursive strategy to detect and decompose
concepts across layers, (ii) a novel method for a more faithful estimation of
concept importance using Sobol indices, and (iii) the use of implicit
differentiation to unlock Concept Attribution Maps.
We conduct both human and computer vision experiments to demonstrate the
benefits of the proposed approach. We show that the proposed concept importance
estimation technique is more faithful to the model than previous methods. When
evaluating the usefulness of the method for human experimenters on a
human-centered utility benchmark, we find that our approach significantly
improves on two of the three test scenarios. Our code is freely available at
github.com/deel-ai/Craft
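Since the abstract highlights Sobol indices as the basis of its concept-importance estimates, here is a generic pick-freeze (Jansen) estimator of first-order Sobol indices on a toy function. This is a minimal sketch of the underlying sensitivity measure only, not CRAFT's actual implementation; the function names and the test model are invented for illustration.

```python
import numpy as np

def first_order_sobol(f, dim, n=10_000, seed=0):
    """Jansen pick-freeze estimator of first-order Sobol indices on [0,1]^dim.

    S_i = Var(E[f | X_i]) / Var(f): the fraction of the output variance
    explained by input i alone."""
    rng = np.random.default_rng(seed)
    A, B = rng.random((n, dim)), rng.random((n, dim))
    fA, fB = f(A), f(B)
    var = np.concatenate([fA, fB]).var()
    S = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]            # AB_i shares only coordinate i with B
        # Jansen: S_i = 1 - E[(f(B) - f(AB_i))^2] / (2 Var(f))
        S[i] = 1.0 - np.mean((fB - f(ABi)) ** 2) / (2.0 * var)
    return S

# Toy additive model: X0 carries almost all of the output variance.
S = first_order_sobol(lambda X: X[:, 0] + 0.1 * X[:, 1], dim=2)
# S[0] should dominate S[1] for this model.
```

In CRAFT the role of the inputs is played by concept activations, and the index measures how much each concept contributes to the variance of the model's output.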
The Minimization of Piecewise Functions: Pseudo Stationarity
There are many significant applied contexts that require the solution of
discontinuous optimization problems in finite dimensions. Yet these problems
are very difficult, both computationally and analytically. With the functions
being discontinuous and a minimizer (local or global) of the problems, even if
it exists, being impossible to verifiably compute, a foremost question is what
kind of ''stationary solutions'' one can expect to obtain; these solutions
provide promising candidates for minimizers; i.e., their defining conditions
are necessary for optimality. Motivated by recent results on sparse
optimization, we introduce in this paper such a kind of solution, termed
''pseudo B- (for Bouligand) stationary solution'', for a broad class of
discontinuous piecewise continuous optimization problems with objective and
constraint defined by indicator functions of the positive real axis composite
with functions that are possibly nonsmooth. We present two approaches for
computing such a solution. One approach is based on lifting the problem to a
higher dimension via the epigraphical formulation of the indicator functions;
this requires the addition of some auxiliary variables. The other approach is
based on certain continuous (albeit not necessarily differentiable) piecewise
approximations of the indicator functions and the convergence to a pseudo
B-stationary solution of the original problem is established. The conditions
for convergence are discussed and illustrated by an example.
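The second approach, continuous piecewise approximation of the indicator of the positive real axis, can be illustrated with one natural surrogate family. The ramp below and the toy objective are assumed for illustration, not the paper's specific construction.

```python
import numpy as np

def step(t):
    """Indicator of the positive real axis: 1 on (0, inf), else 0."""
    return np.where(t > 0, 1.0, 0.0)

def ramp(t, delta):
    """Continuous piecewise-linear surrogate rising from 0 to 1 over (0, delta].

    Converges pointwise to step(t) as delta -> 0; one illustrative family,
    not the paper's specific approximation."""
    return np.clip(t / delta, 0.0, 1.0)

# Toy discontinuous problem: min (x - 0.5)^2 + step(x).  The unit jump at 0
# makes x = 0 (value 0.25) beat the smooth minimizer x = 0.5 (value 1).
x = np.linspace(-2.0, 2.0, 4001)
for delta in (1.0, 0.1, 0.001):
    xstar = x[np.argmin((x - 0.5) ** 2 + ramp(x, delta))]
# for each delta the grid minimizer of the surrogate lands (numerically) at x = 0
```

The grid search here merely visualizes the idea; the paper's contribution is proving that minimizers (or stationary points) of such continuous surrogates converge to a pseudo B-stationary solution of the original discontinuous problem.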
Notes on the value function approach to multiobjective bilevel optimization
This paper is concerned with the value function approach to multiobjective
bilevel optimization which exploits a lower level frontier-type mapping in
order to replace the hierarchical model of two interdependent multiobjective
optimization problems by a single-level multiobjective optimization problem. As
a starting point, different value-function-type reformulations are suggested
and their relations are discussed. Here, we focus on the situations where the
lower level problem is solved up to efficiency or weak efficiency, and an
intermediate solution concept is suggested as well. We study the
graph-closedness of the associated efficiency-type and frontier-type mappings.
These findings are then used for two purposes. First, we investigate existence
results in multiobjective bilevel optimization. Second, for the derivation of
necessary optimality conditions via the value function approach, it is inherent
to differentiate frontier-type mappings in a generalized way. Here, we are
concerned with the computation of upper coderivative estimates for the
frontier-type mapping associated with the setting where the lower level problem
is solved up to weak efficiency. We proceed in two ways, relying, on the one
hand, on a weak domination property and, on the other hand, on a scalarization
approach. Throughout the paper, illustrative examples visualize our findings,
the necessity of crucial assumptions, and some flaws in the related literature.
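For readers new to the value function approach, the scalar (single-objective) prototype of the reformulation reads as follows; the paper's multiobjective setting replaces the optimal value function $\varphi$ by the frontier-type mappings described above.

```latex
\begin{align*}
  &\min_{x \in X,\, y}\; F(x, y)
    \quad\text{s.t.}\quad y \in \operatorname*{argmin}_{y'}\, f(x, y')
  \intertext{becomes the single-level problem}
  &\min_{x \in X,\, y}\; F(x, y)
    \quad\text{s.t.}\quad f(x, y) \le \varphi(x),
  \qquad \varphi(x) := \inf_{y'} f(x, y').
\end{align*}
```

The hierarchical structure is thus absorbed into the nonsmooth constraint $f(x,y) \le \varphi(x)$, which is why generalized differentiation of $\varphi$, or of its frontier-type analogue, is inherent to deriving optimality conditions.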
Beyond Quantity: Research with Subsymbolic AI
How do artificial neural networks and other forms of artificial intelligence interfere with methods and practices in the sciences? Which interdisciplinary epistemological challenges arise when we think about the use of AI beyond its dependency on big data? Not only the natural sciences, but also the social sciences and the humanities seem to be increasingly affected by current approaches to subsymbolic AI, which master problems of quality (fuzziness, uncertainty) in a hitherto unknown way. But what are the conditions, implications, and effects of these (potential) epistemic transformations, and how must research on AI be configured to address them adequately?