
    Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5

    This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The contributions collected in this volume have either been published or presented in international conferences, seminars, workshops and journals since the dissemination of the fourth volume in 2015, or they are new. The contributions within each part of this volume are ordered chronologically. The first part of this book presents theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of the (quasi-)vacuous belief assignment in the fusion of sources of evidence, with their Matlab codes.
Because many more applications of DSmT have emerged since the appearance of the fourth book in 2015, the second part of this volume covers selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification. Finally, the third part presents contributions related to belief functions in general, published or presented over the years since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, the negator of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
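As a concrete illustration of the conflict redistribution that several of these chapters refine, the basic PCR5 combination of two sources can be sketched in a few lines. This is a minimal Python sketch; the function name and the dict-of-frozensets encoding of basic belief assignments are our own, not taken from the book's Matlab codes.

```python
from itertools import product

def pcr5_combine(m1, m2):
    """Combine two basic belief assignments (dicts mapping frozenset focal
    elements to masses) with the PCR5 rule: conjunctive combination, then
    proportional redistribution of each pairwise conflicting mass back to
    the two focal elements that generated it."""
    combined = {}
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            # non-conflicting pair: ordinary conjunctive contribution
            combined[inter] = combined.get(inter, 0.0) + a * b
        elif a + b > 0:
            # conflicting pair: split a*b proportionally between A and B
            combined[A] = combined.get(A, 0.0) + a * a * b / (a + b)
            combined[B] = combined.get(B, 0.0) + b * b * a / (a + b)
    return combined
```

Because each conflicting product a*b is redistributed in full (a²b/(a+b) + b²a/(a+b) = ab), the combined masses still sum to one without any global normalization step.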

    Chatbots for Modelling, Modelling of Chatbots

    Unpublished doctoral thesis, defended at the Universidad Autónoma de Madrid, Escuela Politécnica Superior, Departamento de Ingeniería Informática. Date of defense: 28-03-202

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    LIPIcs, Volume 261, ICALP 2023, Complete Volume

    Data analysis with merge trees

    Today’s data are increasingly complex, and classical statistical techniques need ever more refined mathematical tools to model and investigate them. Paradigmatic situations are represented by data which must be considered up to some kind of transformation, and by all those circumstances in which the analyst needs to define a general concept of shape. Topological Data Analysis (TDA) is a field which is fundamentally contributing to such challenges by extracting topological information from data with a plethora of interpretable and computationally accessible pipelines. We contribute to this field by developing a series of novel tools, techniques and applications for working with a particular topological summary called the merge tree. To analyze sets of merge trees we introduce a novel metric structure along with an algorithm to compute it, define a framework to compare different functions defined on merge trees, and investigate the metric space obtained with the aforementioned metric. Different geometric and topological properties of the space of merge trees are established, with the aim of obtaining a deeper understanding of such trees. To showcase the effectiveness of the proposed metric, we develop an application in the field of Functional Data Analysis, working with functions up to homeomorphic reparametrization, and in the field of radiomics, where each patient is represented via a clustering dendrogram.
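To make the summary concrete: for a function sampled along a line, the merge tree of sublevel sets can be built with a single union-find sweep over the samples. The sketch below is our own illustration of the data structure, not the thesis's metric or algorithm; it records, for each local minimum, the height at which its component merges into an older one.

```python
def merge_tree_pairs(values):
    """Sweep samples of a 1D function bottom-up and return the
    (birth, merge) heights of local-minimum components: the leaves
    and internal nodes of the sublevel-set merge tree (the global
    minimum's component never merges and is not reported)."""
    n = len(values)
    parent, birth, pairs = {}, {}, []

    def find(i):                      # union-find with path compression
        root = i
        while parent[root] != root:
            root = parent[root]
        while parent[i] != root:
            parent[i], i = root, parent[i]
        return root

    for i in sorted(range(n), key=lambda i: (values[i], i)):
        parent[i] = i
        birth[i] = values[i]
        for j in (i - 1, i + 1):      # neighbours already swept?
            if j in parent:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # younger component (larger birth) merges into the older
                old, young = (ri, rj) if birth[ri] <= birth[rj] else (rj, ri)
                if birth[young] < values[i]:   # skip trivial, zero-height events
                    pairs.append((birth[young], values[i]))
                parent[young] = old
    return sorted(pairs)
```

For example, the samples [0.0, 2.0, 1.0, 3.0] have a secondary minimum at height 1.0 whose component merges into the global one at the saddle of height 2.0, giving the single pair (1.0, 2.0).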

    Forward uncertainty quantification with special emphasis on a Bayesian active learning perspective

    Uncertainty quantification (UQ) in its broadest sense aims at quantitatively studying all sources of uncertainty arising from both computational and real-world applications. Although many subtopics appear in the UQ field, there are typically two major types of UQ problems: forward and inverse uncertainty propagation. The present study focuses on the former, which involves assessing the effects of input uncertainty, in its various forms, on the output response of a computational model. In total, this thesis reports nine main developments in the context of forward uncertainty propagation, with special emphasis on a Bayesian active learning perspective. The first development is concerned with estimating the extreme value distribution and small first-passage probabilities of uncertain nonlinear structures under stochastic seismic excitations, for which a moment-generating function-based mixture distribution approach (MGF-MD) is proposed. As the second development, a triple-engine parallel Bayesian global optimization (T-PBGO) method is presented for interval uncertainty propagation. The third contribution develops a parallel Bayesian quadrature optimization (PBQO) method for estimating the response expectation function, its variable importance and bounds when a computational model is subject to hybrid uncertainties in the form of random variables, parametric probability boxes (p-boxes) and interval models. The fourth contribution concerns the failure probability function when the inputs of a performance function are characterized by parametric p-boxes. To this end, an active learning augmented probabilistic integration (ALAPI) method is proposed, based on offering a partially Bayesian active learning perspective on failure probability estimation, as well as on the use of the high-dimensional model representation (HDMR) technique.
Note that in this work we derive an upper bound on the posterior variance of the failure probability, which bounds our epistemic uncertainty about the failure probability due to a kind of numerical uncertainty, i.e., discretization error. The fifth contribution further strengthens the previously developed active learning probabilistic integration (ALPI) method in two ways: enabling the use of parallel computing and enhancing the capability of assessing small failure probabilities; the resulting method is called parallel adaptive Bayesian quadrature (PABQ). The sixth contribution presents a principled Bayesian failure probability inference (BFPI) framework, where the posterior variance of the failure probability is derived (not in closed form). We also develop a parallel adaptive Bayesian failure probability learning (PA-BFPI) method upon the BFPI framework. For the seventh development, we propose a partially Bayesian active learning line sampling (PBAL-LS) method for assessing extremely small failure probabilities, where a partially Bayesian active learning insight is offered for the classical LS method and an upper bound on the posterior variance of the failure probability is deduced. Following the PBAL-LS method, the eighth contribution obtains the expression of the posterior variance of the failure probability in the LS framework, and a Bayesian active learning line sampling (BALLS) method is put forward. The ninth contribution provides another Bayesian active learning alternative to traditional LS: Bayesian active learning line sampling with log-normal process (BAL-LS-LP). In this method, a log-normal process prior, instead of a Gaussian process prior, is assumed for the beta function so as to account for the non-negativity constraint; the approximation error resulting from the root-finding procedure is also taken into consideration.
In conclusion, this thesis presents a set of novel computational methods for forward UQ, especially from a Bayesian active learning perspective. The developed methods are expected to enrich our toolbox for forward UQ analysis, and the insights gained can stimulate further studies.
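For orientation, the crude Monte Carlo estimator that such active learning methods aim to outperform can be written in a few lines. This is an illustrative Python sketch under our own naming; the thesis's methods replace the brute-force evaluation loop with surrogate-guided, adaptively chosen evaluations of an expensive model.

```python
import math
import random

def failure_probability_mc(g, sample, n=100_000, seed=0):
    """Plain Monte Carlo baseline for a failure probability
    Pf = P(g(X) <= 0): draw n inputs, count failures, and report the
    estimate together with its coefficient of variation
    sqrt((1 - Pf) / (Pf * n)), which grows as Pf shrinks -- the reason
    rare-event estimation needs smarter sampling."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if g(sample(rng)) <= 0.0)
    pf = fails / n
    cov = math.sqrt((1.0 - pf) / (pf * n)) if pf > 0 else float("inf")
    return pf, cov
```

For a standard normal input and the performance function g(x) = 3 - x, the true failure probability is about 1.35e-3, and 10^5 samples already leave a coefficient of variation near 9%, illustrating why extremely small probabilities are out of reach for plain Monte Carlo.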

    Update-Aware Information Extraction

    Information extraction programs (extractors) can be applied to documents to isolate structured versions of some of their content by creating tabular records corresponding to facts found in the documents. When extracted relations or source documents are updated, we wish to ensure that those changes are propagated correctly; that is, we recommend that extracted relations be treated as materialized views over the document database. Because extraction is expensive, maintaining extracted relations in the presence of frequent document updates comes at a high execution cost. We propose a practical framework to efficiently update extracted views so that they represent the most recent version of the documents. Our approach entails static analyses of extraction and update programs within a framework compatible with SystemT, a well-known extraction framework based on regular expressions. We describe a multi-level verification process aimed at efficiently identifying document updates for which we can autonomously compute the updated extracted views. Through comprehensive experimentation, we demonstrate the effectiveness of our approach in real-world extraction scenarios. For the reverse problem, we need to translate updates on extracted views into corresponding document updates. We rely on a translation mechanism based on value substitution in the source documents, and classify extractors amenable to value substitution as stable extractors. We again leverage static analyses of extraction programs to study stability for extractors expressed in a significant subset of JAPE, another rule-based extraction language. Using a document spanner representation of the JAPE program, we identify four sufficient properties for being able to translate updates back to the documents, and use them to verify whether an input JAPE program is stable.
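The core idea of treating extracted relations as materialized views can be illustrated with a toy regex extractor. This is purely illustrative Python: the class and method names are ours, and none of the SystemT/JAPE static analyses from the thesis are reproduced; the point is only that per-document caching lets a view be maintained by re-extracting changed documents alone.

```python
import re

class ExtractedView:
    """Toy 'materialized view' over a document store: records are
    produced by a regular-expression extractor and cached per document,
    so an update to one document only re-runs extraction on that one."""

    def __init__(self, pattern):
        self.extractor = re.compile(pattern)
        self.cache = {}                      # doc_id -> list of records

    def refresh(self, docs):
        """Full (re)materialization from a dict doc_id -> text."""
        self.cache = {doc_id: self.extractor.findall(text)
                      for doc_id, text in docs.items()}

    def update(self, doc_id, new_text):
        """Incremental maintenance: propagate one document update."""
        self.cache[doc_id] = self.extractor.findall(new_text)

    def records(self):
        """The extracted relation as (doc_id, record) tuples."""
        return [(d, r) for d, recs in sorted(self.cache.items())
                for r in recs]
```

A real system must additionally decide, as the thesis does via static analysis, when an update provably cannot change the view at all, so that even the per-document re-extraction can be skipped.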

    Making Presentation Math Computable

    This open-access book addresses the issue of translating mathematical expressions from LaTeX to the syntax of Computer Algebra Systems (CAS). Over the past decades, especially in the domain of Science, Technology, Engineering, and Mathematics (STEM), LaTeX has become the de facto standard for typesetting mathematical formulae in publications. Since scientists are generally required to publish their work, LaTeX has become an integral part of today's publishing workflow. On the other hand, modern research increasingly relies on CAS to simplify, manipulate, compute, and visualize mathematics. However, existing LaTeX import functions in CAS are limited to simple arithmetic expressions and are, therefore, insufficient for most use cases. Consequently, the workflow of experimenting and publishing in the sciences often includes time-consuming and error-prone manual conversions between presentational LaTeX and computational CAS formats. To address the lack of a reliable and comprehensive translation tool between LaTeX and CAS, this thesis makes three contributions. First, it provides an approach to enhance LaTeX expressions with sufficient semantic information for translation into CAS syntaxes. Second, it demonstrates the first context-aware LaTeX-to-CAS translation framework, LaCASt. Third, it provides a novel approach to evaluate the performance of LaTeX-to-CAS translations on large-scale datasets, with automatic verification of equations in digital mathematical libraries.
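To see why the simple cases are easy and the general problem is hard, a purely syntactic rewriter for a handful of LaTeX constructs fits in a dozen lines. This is our own toy sketch, far from what a context-aware system like LaCASt does; it breaks as soon as a symbol's meaning depends on context.

```python
import re

def latex_to_cas(expr):
    """Naive, purely syntactic LaTeX -> CAS-style rewriting for a few
    constructs: \\frac, common function names, \\pi, powers, braces."""
    # \frac{a}{b} -> ((a)/(b)); loop handles nested fractions inside-out
    frac = re.compile(r'\\frac\{([^{}]*)\}\{([^{}]*)\}')
    while frac.search(expr):
        expr = frac.sub(r'((\1)/(\2))', expr)
    expr = re.sub(r'\\(sin|cos|tan|log|exp)', r'\1', expr)  # drop backslashes
    expr = expr.replace(r'\pi', 'pi').replace('^', '**')    # constants, powers
    expr = re.sub(r'\{([^{}]*)\}', r'(\1)', expr)           # leftover braces
    return expr
```

The rewriter already fails on anything semantic, e.g. deciding whether a prime means differentiation or whether a superscript is a power or an index, which is exactly the gap the thesis addresses.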

    Syntax-semantics interface: an algebraic model

    We extend our formulation of Merge and Minimalism in terms of Hopf algebras to an algebraic model of the syntax-semantics interface. We show that methods adopted in the formulation of renormalization (extraction of meaningful physical values) in theoretical physics are relevant to describing the extraction of meaning from syntactic expressions. We show how this formulation relates to computational models of semantics, and we answer some recent controversies about the implications of the current functioning of large language models for generative linguistics.
    Comment: LaTeX, 75 pages, 19 figures

    Human History and Digital Future

    Corrected reprint. In the chapter "Wallace/Moullou: Viability of Production and Implementation of Retrospective Photogrammetry in Archaeology", the acknowledgements were removed. The Proceedings of the 46th Annual Conference on Computer Applications and Quantitative Methods in Archaeology, held between March 19th and 23rd, 2018 at the University of Tübingen, Germany, discuss current questions concerning digital recording, computer analysis, graphic and 3D visualization, data management and communication in the field of archaeology. Through a selection of diverse case studies from all over the world, the proceedings give an overview of new technical approaches and best practices from various archaeological and computer-science disciplines.