Advances and Applications of DSmT for Information Fusion. Collected Works, Volume 5
This fifth volume on Advances and Applications of DSmT for Information Fusion collects theoretical and applied contributions of researchers working in different fields of application and in mathematics, and is available in open access. The contributions collected in this volume have either been published or presented at international conferences, seminars and workshops or in journals since the dissemination of the fourth volume in 2015, or they are new. The contributions in each part of this volume are ordered chronologically.
The first part of this book presents some theoretical advances on DSmT, dealing mainly with modified Proportional Conflict Redistribution (PCR) rules of combination with degree of intersection, coarsening techniques, interval calculus for PCR thanks to set inversion via interval analysis (SIVIA), rough set classifiers, canonical decomposition of dichotomous belief functions, fast PCR fusion, fast inter-criteria analysis with PCR, and improved PCR5 and PCR6 rules preserving the (quasi-)neutrality of (quasi-)vacuous belief assignments in the fusion of sources of evidence, with their Matlab codes.
Because more applications of DSmT have emerged since the publication of the fourth volume in 2015, the second part of this volume presents selected applications of DSmT, mainly in building change detection, object recognition, quality of data association in tracking, perception in robotics, risk assessment for torrent protection and multi-criteria decision-making, multi-modal image fusion, coarsening techniques, recommender systems, levee characterization and assessment, human heading perception, trust assessment, robotics, biometrics, failure detection, GPS systems, inter-criteria analysis, group decision, human activity recognition, storm prediction, data association for autonomous vehicles, identification of maritime vessels, fusion of support vector machines (SVM), the Silx-Furtif RUST code library for information fusion including PCR rules, and networks for ship classification.
Finally, the third part presents contributions related to belief functions in general that have been published or presented since 2015. These contributions concern decision-making under uncertainty, belief approximations, probability transformations, new distances between belief functions, non-classical multi-criteria decision-making problems with belief functions, the generalization of Bayes' theorem, image processing, data association, entropy and cross-entropy measures, fuzzy evidence numbers, the negator of belief mass, human activity recognition, information fusion for breast cancer therapy, imbalanced data classification, and hybrid techniques mixing deep learning with belief functions.
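The PCR5 rule that recurs throughout this volume redistributes each partial conflicting mass back to the elements that generated it, proportionally to the masses they contributed. As a hedged illustration (not the volume's Matlab code), the following sketch combines two basic belief assignments on a dichotomous frame {A, B} with ignorance A∪B:

```python
# Illustrative sketch of the PCR5 combination rule for two basic belief
# assignments (BBAs) on a dichotomous frame {A, B} with ignorance 'AuB'.
# This is a didactic toy, not the reference implementation from the book.

def pcr5_combine(m1, m2):
    """Combine two BBAs given as dicts over 'A', 'B', 'AuB' with PCR5."""
    focal = ['A', 'B', 'AuB']

    def inter(x, y):
        # Intersection on the dichotomous frame: X n (AuB) = X, A n B = empty.
        if x == y: return x
        if x == 'AuB': return y
        if y == 'AuB': return x
        return None  # conflicting pair

    m = {k: 0.0 for k in focal}
    # Conjunctive consensus part.
    for x in focal:
        for y in focal:
            z = inter(x, y)
            if z is not None:
                m[z] += m1[x] * m2[y]
    # PCR5 part: redistribute each partial conflict m1(X)m2(Y), X n Y = empty,
    # back to X and Y proportionally to the masses involved.
    for x, y in [('A', 'B'), ('B', 'A')]:
        if m1[x] + m2[y] > 0:
            conflict = m1[x] * m2[y]
            m[x] += conflict * m1[x] / (m1[x] + m2[y])
            m[y] += conflict * m2[y] / (m1[x] + m2[y])
    return m

m1 = {'A': 0.6, 'B': 0.3, 'AuB': 0.1}
m2 = {'A': 0.2, 'B': 0.5, 'AuB': 0.3}
fused = pcr5_combine(m1, m2)
print(fused)  # masses sum to 1 by construction
```

Because the redistributed shares of every conflicting product sum exactly to that product, the fused masses remain normalized without any division by the total conflict, which is what distinguishes PCR rules from Dempster's rule.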
Geometric Data Analysis: Advancements of the Statistical Methodology and Applications
Data analysis has become fundamental to our society and comes in multiple facets and approaches. Nevertheless, in research and applications, the focus has primarily been on data from Euclidean vector spaces. Consequently, the majority of methods that are applied today are not suited to more general data types. Driven by needs from fields like image processing, (medical) shape analysis, and network analysis, more and more attention has recently been given to data from non-Euclidean spaces, particularly (curved) manifolds. This has led to the field of geometric data analysis, whose methods explicitly take the structure (for example, the topology and geometry) of the underlying space into account.
This thesis contributes to the methodology of geometric data analysis by generalizing several fundamental notions from multivariate statistics to manifolds. We thereby focus on two different viewpoints.
First, we use Riemannian structures to derive a novel regression scheme for general manifolds that relies on splines of generalized Bézier curves. It can accurately model non-geodesic relationships, for example, time-dependent trends with saturation effects or cyclic trends. Since Bézier curves can be evaluated with the constructive de Casteljau algorithm, working with data from manifolds of high dimension (for example, a hundred thousand or more) is feasible. Relying on the regression, we further develop a hierarchical statistical model for an adequate analysis of longitudinal data in manifolds, and a method to control for confounding variables.
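The constructive de Casteljau algorithm mentioned above evaluates a Bézier curve by repeated pairwise interpolation of control points; on a manifold, the straight-line interpolations are replaced by geodesic ones. The following is a minimal sketch on the unit sphere, with spherical linear interpolation (slerp) as the geodesic step. It is illustrative only and not the thesis's implementation:

```python
import numpy as np

# Sketch of the generalized de Casteljau algorithm on the unit sphere:
# ordinary linear interpolation between control points is replaced by
# geodesic (slerp) interpolation. Names and setup are illustrative.

def slerp(p, q, t):
    """Point at parameter t on the sphere geodesic from unit vector p to q."""
    omega = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    if omega < 1e-12:
        return p.copy()
    return (np.sin((1 - t) * omega) * p + np.sin(t * omega) * q) / np.sin(omega)

def de_casteljau_sphere(control_points, t):
    """Evaluate a sphere-valued Bezier curve by repeated geodesic interpolation."""
    pts = [p / np.linalg.norm(p) for p in control_points]
    while len(pts) > 1:  # each pass shortens the list by one
        pts = [slerp(pts[i], pts[i + 1], t) for i in range(len(pts) - 1)]
    return pts[0]

ctrl = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
x = de_casteljau_sphere(ctrl, 0.5)
print(x, np.linalg.norm(x))  # evaluated point stays on the sphere
```

Because each step only uses geodesic evaluation, the same recursion applies verbatim on any manifold where geodesics can be computed, which is what makes the scheme tractable in high dimensions.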
Second, we focus on data that is not only manifold- but even Lie group-valued, which is frequently the case in applications. We can only achieve this by endowing the group with an affine connection structure that is generally not Riemannian. Utilizing it, we derive generalizations of several well-known dissimilarity measures between data distributions that can be used for various tasks, including hypothesis testing. Invariance under data translations is proven, and a connection to continuous distributions is given for one measure.
A further central contribution of this thesis is that it shows use cases for all notions in real-world applications, particularly in problems from shape analysis in medical imaging and archaeology. We can replicate or further quantify several known findings for shape changes of the femur and the right hippocampus under osteoarthritis and Alzheimer's, respectively. Furthermore, in an archaeological application, we obtain new insights into the construction principles of ancient sundials. Last but not least, we use the geometric structure underlying human brain connectomes to predict cognitive scores. Utilizing a sample selection procedure, we obtain state-of-the-art results.
Development of Bridge Information Model (BrIM) for digital twinning and management using TLS technology
In the current era of information and technology, the concept of the Building Information Model (BIM) has made revolutionary changes in different aspects of engineering design, construction, and management of infrastructure assets, especially bridges. In the field of bridge engineering, the Bridge Information Model (BrIM), a specific form of BIM, provides a digital twin of the physical asset that associates geometrical inspections with non-geometrical data. This has eliminated the use of traditional paper-based documentation and hand-written reports, enabling professionals and managers to operate more efficiently and effectively. However, concerns remain about the quality of the acquired inspection data and the use of BrIM information for remedial decisions in a reliable Bridge Management System (BMS), both of which are still reliant on the knowledge and experience of the inspectors or asset managers involved and are therefore susceptible to a certain degree of subjectivity. This research study therefore aims not only to demonstrate the valuable benefits of Terrestrial Laser Scanning (TLS) as a precise, rapid, and qualitative inspection method, but also to present a novel slice-based approach for extracting a bridge's geometric Computer-Aided Design (CAD) model from a TLS-based point cloud, contributing to BrIM development. Moreover, this study presents a comprehensive methodology for incorporating the generated BrIM in a redeveloped element-based condition assessment model while integrating a Decision Support System (DSS) to propose an innovative BMS. This methodology was further implemented in a designed software plugin and validated by a real case study on the Werrington Bridge, a cable-stayed bridge in New South Wales, Australia.
The findings of this research confirm the reliability of the TLS-derived 3D model in terms of the quality of the acquired data and the accuracy of the proposed slice-based method, as well as the BrIM implementation and the integration of the proposed BMS into the developed BrIM. Furthermore, the results of this study showed that the proposed integrated model addresses the subjective nature of decision-making by conducting a risk assessment and utilising structured decision-making tools for priority ranking of remedial actions. The findings demonstrated acceptable agreement in using the proposed BMS for priority ranking of structural elements that require more attention, as well as efficient optimisation of remedial actions to preserve bridge health and safety.
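The core idea of a slice-based extraction, partitioning the point cloud into thin cross-sections along the bridge's longitudinal axis and summarizing each one, can be sketched in a few lines. The sketch below is a hypothetical simplification (binning plus per-slice centroids on synthetic data), not the study's actual plugin, which fits full CAD cross-sections per slice:

```python
import numpy as np

# Hedged, hypothetical sketch of slice-based point-cloud processing:
# bin TLS points along the longitudinal axis and compute one centroid
# per slice. Parameter names and the synthetic deck are illustrative.

def slice_centroids(points, axis=0, slice_width=0.5):
    """points: (N, 3) array of TLS returns; returns one centroid per non-empty slice."""
    coords = points[:, axis]
    edges = np.arange(coords.min(), coords.max() + slice_width, slice_width)
    centroids = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_slice = points[(coords >= lo) & (coords < hi)]
        if len(in_slice):
            centroids.append(in_slice.mean(axis=0))
    return np.array(centroids)

rng = np.random.default_rng(0)
deck = np.column_stack([rng.uniform(0.0, 10.0, 5000),   # longitudinal position (m)
                        rng.normal(0.0, 0.2, 5000),     # transverse scatter (m)
                        rng.normal(0.0, 0.05, 5000)])   # vertical noise (m)
cs = slice_centroids(deck)
print(cs.shape)  # roughly twenty 0.5 m slices along a 10 m span
```

Per-slice summaries like these are what a downstream step could compare against design geometry to flag deviations; the real method's cross-section fitting is considerably richer.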
An Object SLAM Framework for Association, Mapping, and High-Level Tasks
Object SLAM is considered increasingly significant for robot high-level perception and decision-making. Existing studies fall short in terms of data association, object representation, and semantic mapping, and frequently rely on additional assumptions that limit their performance. In this paper, we present a comprehensive object SLAM framework that focuses on object-based perception and object-oriented robot tasks. First, we propose an ensemble data association approach for associating objects in complicated conditions by incorporating parametric and nonparametric statistical testing. In addition, we suggest an outlier-robust centroid and scale estimation algorithm for modeling objects based on iForest and line alignment. A lightweight, object-oriented map is then represented by the estimated general object models. Taking the semantic invariance of objects into consideration, we convert the object map to a topological map that provides semantic descriptors to enable multi-map matching. Finally, we suggest an object-driven active exploration strategy to achieve autonomous mapping in grasping scenarios. The proposed object SLAM framework has been evaluated on a range of public datasets and in real-world experiments covering mapping, augmented reality, scene matching, relocalization, and robotic manipulation, demonstrating its efficient performance.
Comment: Accepted by IEEE Transactions on Robotics (T-RO).
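One standard parametric ingredient of data association of the kind described above is a Mahalanobis-distance gate with a chi-square threshold: a detection is associated to a mapped object only if the distance between it and the object's estimated centroid is statistically plausible under the centroid's covariance. The sketch below is a hedged, generic illustration of that test, not the paper's ensemble method:

```python
import numpy as np

# Hedged sketch of parametric data association: gate a new 3D detection
# against existing object landmarks using the squared Mahalanobis distance.
# The threshold is the chi-square 95% quantile for 3 degrees of freedom.
CHI2_95_3DOF = 7.815

def associate(detection, landmarks):
    """Return the index of the best-matching landmark, or None if all fail the gate.

    landmarks: list of (mean, covariance) pairs for object centroids."""
    best, best_d2 = None, CHI2_95_3DOF
    for i, (mu, cov) in enumerate(landmarks):
        diff = detection - mu
        d2 = diff @ np.linalg.inv(cov) @ diff  # squared Mahalanobis distance
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best

landmarks = [(np.zeros(3), 0.01 * np.eye(3)),
             (np.array([2.0, 0.0, 0.0]), 0.01 * np.eye(3))]
print(associate(np.array([0.05, 0.0, 0.02]), landmarks))  # → 0
print(associate(np.array([5.0, 5.0, 5.0]), landmarks))    # → None
```

Combining such a parametric gate with a nonparametric test over accumulated measurements is what makes an ensemble approach robust when the Gaussian assumption breaks down.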
Integrality and cutting planes in semidefinite programming approaches for combinatorial optimization
Many real-life decision problems are discrete in nature. To solve such problems as mathematical optimization problems, integrality constraints are commonly incorporated in the model to reflect the choice of finitely many alternatives. At the same time, it is known that semidefinite programming is very suitable for obtaining strong relaxations of combinatorial optimization problems. In this dissertation, we study the interplay between semidefinite programming and integrality, with a special focus on the use of cutting-plane methods. Although the notions of integrality and cutting planes are well-studied in linear programming, integer semidefinite programs (ISDPs) have been considered only recently. We show that many combinatorial optimization problems can be modeled as ISDPs. Several theoretical concepts, such as the Chvátal-Gomory closure, total dual integrality and integer Lagrangian duality, are studied for the case of integer semidefinite programming. On the practical side, we introduce an improved branch-and-cut approach for ISDPs and a cutting-plane augmented Lagrangian method for solving semidefinite programs with a large number of cutting planes. Throughout the thesis, we apply our results to a wide range of combinatorial optimization problems, among which the quadratic cycle cover problem, the quadratic traveling salesman problem and the graph partition problem. Our approaches lead to novel, strong and efficient solution strategies for these problems, with the potential to be extended to other problem classes.
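The Chvátal-Gomory closure studied in the dissertation generalizes a simple rounding principle from linear programming: if an inequality with integer coefficients is valid for the relaxation, then rounding down its right-hand side yields an inequality valid for all integer feasible points. A toy illustration of that linear-programming step (the thesis's contribution concerns its extension to ISDPs) looks like this:

```python
from math import floor
from itertools import product

# Toy illustration of a Chvatal-Gomory rounding step in the linear setting:
# if a^T x <= b is valid and a has integer entries, then a^T x <= floor(b)
# is valid for every *integer* feasible point. The thesis studies the
# analogous closure for integer semidefinite programs.

a, b = (1, 1), 1.5      # valid inequality x1 + x2 <= 1.5 over the unit box
cut_rhs = floor(b)      # rounded cut: x1 + x2 <= 1

# All 0/1 points satisfying the original inequality...
integer_points = [x for x in product((0, 1), repeat=2) if x[0] + x[1] <= b]
# ...must also satisfy the rounded cut: no violations expected.
violations = [x for x in integer_points if x[0] + x[1] > cut_rhs]
print(integer_points, violations)
```

The cut x1 + x2 <= 1 strictly tightens the relaxation (it cuts off the fractional point (0.75, 0.75)) while preserving every integer solution, which is exactly the property a cutting plane must have.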
Biological Protein Patterning Systems across the Domains of Life: from Experiments to Modelling
Distinct localisation of macromolecular structures relative to cell shape is a common feature across the domains of life. One mechanism for achieving spatiotemporal intracellular organisation is the Turing reaction-diffusion system (e.g. the Min system in the bacterium Escherichia coli, which controls cell division). In this thesis, I explore potential Turing systems in archaea and eukaryotes as well as the effects of subdiffusion. Recently, a MinD homologue, MinD4, in the archaeon Haloferax volcanii was found to form a dynamic spatiotemporal pattern that is distinct from that of E. coli in its localisation and function. I investigate all four archaeal Min paralogue systems in H. volcanii by identifying four putative MinD activator proteins based on their genomic location and show that they alter motility but do not control MinD4 patterning. Additionally, one of these proteins shows remarkably fast dynamic motion with speeds comparable to eukaryotic molecular motors, while its function appears to be to control motility via interaction with the archaellum. In metazoa, neurons are highly specialised cells whose functions rely on the proper segregation of proteins to the axonal and somatodendritic compartments. These compartments are bounded by a structure called the axon initial segment (AIS), which is precisely positioned in the proximal axonal region during early neuronal development. How neurons control these self-organised localisations is poorly understood. Using a top-down analysis of developing neurons in vitro, I show that the AIS lies at the nodal plane of the first non-homogeneous spatial harmonic of the neuron shape, while a key axonal protein, Tau, is distributed with a concentration that matches the same harmonic. These results are consistent with an underlying Turing patterning system which remains to be identified. The complex intracellular environment often gives rise to subdiffusive dynamics of molecules that may affect patterning.
To simulate the subdiffusive transport of biopolymers, I develop a stochastic simulation algorithm based on the continuous time random walk framework, which is then applied to a model of a dimeric molecular motor. This provides insight into the effects of subdiffusion on motor dynamics, where subdiffusion reduces motor speed while increasing the stall force. Overall, this thesis makes progress towards understanding intracellular patterning systems in different organisms, across the domains of life.
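In the continuous time random walk (CTRW) framework referenced above, subdiffusion arises when waiting times between jumps are drawn from a heavy-tailed distribution with infinite mean (tail exponent alpha < 1), so the mean-squared displacement grows sublinearly in time. A minimal illustrative sketch, not the thesis's simulation code, for a one-dimensional lattice walker:

```python
import random

# Minimal sketch of a continuous-time random walk (CTRW) on a 1D lattice.
# Pareto-tailed waiting times with exponent alpha < 1 have infinite mean
# and produce subdiffusive spreading. Illustrative only.

def ctrw_position(t_max, alpha=0.7, step=1.0, rng=None):
    """Walker position at time t_max under Pareto(alpha) waiting times >= 1."""
    rng = rng or random.Random(0)
    t, x = 0.0, 0.0
    while True:
        # Inverse-transform sampling: (1-U)^(-1/alpha) is Pareto(alpha), >= 1.
        wait = (1.0 - rng.random()) ** (-1.0 / alpha)
        if t + wait > t_max:
            return x  # the walker is still waiting when observation ends
        t += wait
        x += step if rng.random() < 0.5 else -step

positions = [ctrw_position(1000.0, rng=random.Random(seed)) for seed in range(200)]
msd = sum(p * p for p in positions) / len(positions)
print(msd)  # far below the ~t growth of ordinary diffusion for alpha < 1
```

Averaging over an ensemble of walkers like this is how one would empirically read off the anomalous exponent: the MSD scales roughly as t^alpha rather than t.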
NeuroGF: A Neural Representation for Fast Geodesic Distance and Path Queries
Geodesics are essential in many geometry processing applications. However, traditional algorithms for computing geodesic distances and paths on 3D mesh models are often inefficient and slow, which makes them impractical for scenarios that require extensive querying of arbitrary point-to-point geodesics. Although neural implicit representations have emerged as a popular way of representing 3D shape geometries, there is still no research on representing geodesics with deep implicit functions. To bridge this gap, this paper presents the first attempt to represent geodesics on 3D mesh models using neural implicit functions. Specifically, we introduce neural geodesic fields (NeuroGFs), which are learned to represent the all-pairs geodesics of a given mesh. Using NeuroGFs, we can efficiently and accurately answer queries of arbitrary point-to-point geodesic distances and paths, overcoming the limitations of traditional algorithms. Evaluations on common 3D models show that NeuroGFs exhibit exceptional performance in solving single-source all-destination (SSAD) and point-to-point geodesic queries, and consistently achieve high accuracy. Moreover, NeuroGFs offer the unique advantage of encoding both 3D geometry and geodesics in a unified representation. Code is made available at https://github.com/keeganhk/NeuroGF/tree/master.
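A network representing pairwise geodesic distances should at minimum be symmetric in its two point arguments and vanish when they coincide. The sketch below shows one hypothetical way to build those properties into an untrained MLP by symmetrizing the input features and scaling by the Euclidean gap; it is a structural illustration only, not the NeuroGF architecture (see the repository above for that):

```python
import numpy as np

# Structural sketch (untrained, hypothetical architecture) of a neural field
# mapping a point pair (p, q) to a scalar distance. Symmetrized features make
# the output order-invariant; scaling by |p - q| pins d(p, p) = 0 exactly.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 6)), np.zeros(64)   # hidden layer weights
W2, b2 = rng.normal(size=(1, 64)), np.zeros(1)    # output layer weights

def distance_field(p, q):
    feat = np.concatenate([p + q, np.abs(p - q)])  # symmetric in (p, q)
    h = np.maximum(W1 @ feat + b1, 0.0)            # ReLU hidden layer
    out = np.logaddexp(0.0, (W2 @ h + b2)[0])      # softplus -> nonnegative
    return np.linalg.norm(p - q) * out             # exact zero at p == q

p, q = rng.normal(size=3), rng.normal(size=3)
print(distance_field(p, q), distance_field(q, p))  # equal by construction
print(distance_field(p, p))                        # 0.0
```

Training such a field against ground-truth geodesics (e.g. from an exact mesh algorithm) would then amortize the expensive computation into a single cheap forward pass per query, which is the efficiency argument of the paper.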
The Fifteenth Marcel Grossmann Meeting
The three volumes of the proceedings of MG15 give a broad view of all aspects of gravitational physics and astrophysics, from mathematical issues to recent observations and experiments. The scientific program of the meeting included 40 morning plenary talks over 6 days, 5 evening popular talks and nearly 100 parallel sessions on 71 topics spread over 4 afternoons. These proceedings are a representative sample of the very many oral and poster presentations made at the meeting. Part A contains plenary and review articles and the contributions from some parallel sessions, while Parts B and C consist of those from the remaining parallel sessions. The contents range from the mathematical foundations of classical and quantum gravitational theories including recent developments in string theory, to precision tests of general relativity including progress towards the detection of gravitational waves, and from supernova cosmology to relativistic astrophysics, including topics such as gamma ray bursts, black hole physics both in our galaxy and in active galactic nuclei in other galaxies, and neutron star, pulsar and white dwarf astrophysics.
Parallel sessions touch on dark matter, neutrinos, X-ray sources, astrophysical black holes, neutron stars, white dwarfs, binary systems, radiative transfer, accretion disks, quasars, gamma ray bursts, supernovae, alternative gravitational theories, perturbations of collapsed objects, analog models, black hole thermodynamics, numerical relativity, gravitational lensing, large scale structure, observational cosmology, early universe models and cosmic microwave background anisotropies, inhomogeneous cosmology, inflation, global structure, singularities, chaos, Einstein-Maxwell systems, wormholes, exact solutions of Einstein's equations, gravitational waves, gravitational wave detectors and data analysis, precision gravitational measurements, quantum gravity and loop quantum gravity, quantum cosmology, strings and branes, self-gravitating systems, gamma ray astronomy, cosmic rays and the history of general relativity.
Probabilistic Numerical Linear Algebra for Machine Learning
Machine learning models are becoming increasingly essential in domains where critical decisions must be made under uncertainty, such as in public policy, medicine or robotics. For a model to be useful for decision-making, it must convey a degree of certainty in its predictions. Bayesian models are well-suited to such settings due to their principled uncertainty quantification, given a set of assumptions about the problem and data-generating process. While in theory, inference in a Bayesian model is fully specified, in practice, numerical approximations have a significant impact on the resulting posterior. Therefore, model-based decisions are not just determined by the data but also by the numerical method. This raises the question of how we can account for the adverse impact of numerical approximations on inference.
Arguably, the most common numerical task in scientific computing is the solution of linear systems, which arise in probabilistic inference, graph theory, differential equations and optimization. In machine learning, these systems are typically large-scale, subject to noise and arise from generative processes. These unique characteristics call for specialized solvers. In this thesis, we propose a class of probabilistic linear solvers, which infer the solution to a linear system and can be interpreted as learning algorithms themselves. Importantly, they can leverage problem structure and propagate their error to the prediction of the underlying probabilistic model. Next, we apply such solvers to accelerate Gaussian process inference. While Gaussian processes are a principled and flexible model class, for large datasets inference is computationally prohibitive in both time and memory due to the required computations with the kernel matrix. We show that by approximating the posterior with a probabilistic linear solver, we can invest an arbitrarily small amount of computation and still obtain a provably coherent prediction that quantifies uncertainty exactly. Finally, we demonstrate that Gaussian process hyperparameter optimization can similarly be accelerated by leveraging structural prior knowledge in the model via preconditioning of iterative methods. Combined with modern parallel hardware, this enables training Gaussian process models on datasets with hundreds of thousands of data points.
In summary, we demonstrate that interpreting numerical methods in linear algebra as probabilistic learning algorithms unlocks significant performance improvements for Gaussian process models. Crucially, we show how to account for the impact of numerical approximations on model predictions via uncertainty quantification. This enables an explicit trade-off between computational resources and confidence in a prediction. The techniques developed in this thesis have advanced the understanding of probabilistic linear solvers, shifted the goalposts of what can be expected from Gaussian process approximations, and defined the way large-scale Gaussian process hyperparameter optimization is performed in GPyTorch, arguably the most popular library for Gaussian processes in Python.
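The compute/accuracy trade-off at the heart of this thesis can be seen in miniature with a truncated iterative solver on a kernel system K x = y: fewer iterations mean less computation and a larger residual. The sketch below uses plain conjugate gradients as the iterative core; the thesis's probabilistic solvers additionally maintain a posterior over the solution, which this illustration does not attempt:

```python
import numpy as np

# Illustrative sketch of trading computation for accuracy: conjugate
# gradients on a kernel system K x = y, truncated after a fixed budget
# of iterations. Not the thesis's probabilistic solver, just its
# iterative core.

def truncated_cg(K, y, num_iters):
    """Approximately solve K x = y with at most num_iters CG iterations."""
    x = np.zeros_like(y)
    r = y - K @ x           # initial residual
    p = r.copy()            # initial search direction
    for _ in range(num_iters):
        if np.sqrt(r @ r) < 1e-12:
            break           # already converged
        Kp = K @ p
        alpha = (r @ r) / (p @ Kp)
        x += alpha * p
        r_new = r - alpha * Kp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x

rng = np.random.default_rng(1)
X = rng.uniform(size=(50, 1))
K = np.exp(-0.5 * (X - X.T) ** 2 / 0.1**2) + 1e-4 * np.eye(50)  # RBF kernel + jitter
y = np.sin(6 * X[:, 0])
coarse = truncated_cg(K, y, 5)    # small compute budget
fine = truncated_cg(K, y, 40)     # larger compute budget
print(np.linalg.norm(K @ coarse - y), np.linalg.norm(K @ fine - y))
```

A probabilistic solver wraps exactly this kind of iteration with a calibrated error estimate, so the leftover residual after a small budget can be propagated into the Gaussian process posterior instead of being silently ignored.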
Latent Disentanglement for the Analysis and Generation of Digital Human Shapes
Analysing and generating digital human shapes is crucial for a wide variety of applications, ranging from movie production to healthcare. The most common approaches for the analysis and generation of digital human shapes involve the creation of statistical shape models. At the heart of these techniques is the definition of a mapping between shapes and a low-dimensional representation. However, making these representations interpretable is still an open challenge. This thesis explores latent disentanglement as a powerful technique to make the latent space of statistical shape models based on geometric deep learning more structured and interpretable. In particular, it introduces two novel techniques to disentangle the latent representation of variational autoencoders and generative adversarial networks with respect to the local shape attributes characterising the identity of the generated body and head meshes. This work was inspired by a shape completion framework that was proposed as a viable alternative to intraoperative registration in minimally invasive surgery of the liver. In addition, one of these methods for latent disentanglement was also applied to plastic surgery, where it was shown to improve the diagnosis of craniofacial syndromes and aid surgical planning.