6 research outputs found

    View fusion vis-à-vis a Bayesian interpretation of Black–Litterman for portfolio allocation

    Get PDF
    The Black–Litterman model extends the Markowitz modern portfolio theory framework to incorporate investor views. The authors consider a case in which multiple view estimates, including uncertainties, are given for the same underlying subset of assets at a point in time. This motivates their consideration of data fusion techniques for combining information from multiple sources. In particular, they consider consistency-based methods that yield fused view and uncertainty pairs; such methods are not common in the quantitative finance literature. They show a relevant, modern case of incorporating machine learning model-derived view and uncertainty estimates, and the impact on portfolio allocation, with an example subsuming arbitrage pricing theory. Hence, they show the value of the Black–Litterman model in combination with information fusion and artificial intelligence–grounded prediction methods.
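    As context, the standard Black–Litterman posterior combines an equilibrium prior with view estimates and their uncertainties. The sketch below is a minimal illustration of that well-known update; the preceding fuse_views helper is a hypothetical stand-in (simple inverse-variance pooling) for the consistency-based view fusion the abstract describes, not the authors' actual method.

        import numpy as np

        def fuse_views(view_means, view_vars):
            """Hypothetical stand-in for a consistency-based fusion step:
            inverse-variance pooling of several (view, uncertainty) pairs
            given for the same assets."""
            w = 1.0 / np.asarray(view_vars)
            fused_var = 1.0 / w.sum(axis=0)
            fused_mean = fused_var * (w * np.asarray(view_means)).sum(axis=0)
            return fused_mean, fused_var

        def black_litterman_posterior(pi, Sigma, P, q, Omega, tau=0.05):
            """Standard Black-Litterman posterior mean and covariance, combining
            the equilibrium prior (pi, tau*Sigma) with views q = P mu + noise."""
            tS_inv = np.linalg.inv(tau * Sigma)
            Om_inv = np.linalg.inv(Omega)
            post_cov = np.linalg.inv(tS_inv + P.T @ Om_inv @ P)
            post_mean = post_cov @ (tS_inv @ pi + P.T @ Om_inv @ q)
            return post_mean, post_cov

    The fused view and its pooled uncertainty would then enter the posterior update as one row of q and one diagonal entry of Omega.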

    Decentralized Riemannian Particle Filtering with Applications to Multi-Agent Localization

    Get PDF
    The primary focus of this research is to develop consistent nonlinear decentralized particle filtering approaches to the problem of multiple-agent localization. A key aspect of our development is the use of Riemannian geometry to exploit the inherently non-Euclidean characteristics that are typical of multiple-agent localization scenarios. A decentralized formulation is considered because of the practical advantages it provides over centralized fusion architectures. Inspiration is taken from the relatively new field of information geometry and the more established research field of computer vision. Differential geometric tools such as manifolds, geodesics, tangent spaces, and exponential and logarithmic mappings are used extensively to describe probabilistic quantities. Numerous probabilistic parameterizations were identified; we settle on the efficient square-root probability density function parameterization. The square-root parameterization has the benefit of allowing filter calculations to be carried out on the well-studied Riemannian unit hypersphere. A key advantage of the unit hypersphere is that it permits closed-form calculations, a characteristic not shared by current solution approaches. Through the Riemannian geometry of the unit hypersphere, we are able to produce estimates that are not overly optimistic. Results are presented that clearly show the ability of the proposed approaches to outperform current state-of-the-art decentralized particle filtering methods. In particular, results are presented that emphasize the achievable improvement in estimation error, estimator consistency, and required computational burden.
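    The closed-form calculations mentioned above follow from a standard fact: the square root of a density has unit norm, so it can be treated as a point on the unit hypersphere, whose exponential and logarithmic maps and geodesic distance are available in closed form. The sketch below shows only those generic sphere operations for a density discretized on a grid (an assumption for illustration); it is not the thesis's filter.

        import numpy as np

        def sqrt_embed(p, dx):
            """Map a discretized density p (grid spacing dx) to x = sqrt(p * dx),
            so that ||x|| = 1 and x lies on the unit hypersphere."""
            x = np.sqrt(p * dx)
            return x / np.linalg.norm(x)   # guard against discretization error

        def sphere_log(x, y, eps=1e-12):
            """Logarithmic map at x: tangent vector pointing from x toward y."""
            c = np.clip(np.dot(x, y), -1.0, 1.0)
            theta = np.arccos(c)           # geodesic distance on the unit sphere
            v = y - c * x
            nv = np.linalg.norm(v)
            return np.zeros_like(x) if nv < eps else theta * v / nv

        def sphere_exp(x, v, eps=1e-12):
            """Exponential map at x: follow the geodesic in direction v for length ||v||."""
            nv = np.linalg.norm(v)
            return x if nv < eps else np.cos(nv) * x + np.sin(nv) * v / nv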

    A stochastic method for representation, modelling and fusion of excavated material in mining

    Get PDF
    The need to safely and economically extract raw materials such as iron ore from a greater number of remote, isolated and possibly dangerous locations will become more pressing over the coming decades as easily accessible deposits become depleted. An autonomous mining system has the potential to make the mining process more efficient, predictable and safe under these changing conditions. One of the key parts of the mining process is the estimation and tracking of bulk material through the mining production chain. Current state-of-the-art tracking and estimation systems use a deterministic representation for bulk material. This is problematic for wide-scale automation of mine processes as there is no measurement of the uncertainty in the estimates provided. A probabilistic representation is critical for autonomous systems to correctly interpret and fuse the available data in order to make the most informed decision given the available information without human intervention. This thesis investigates whether bulk material properties can be represented probabilistically through a mining production chain to provide statistically consistent estimates of the material at each stage of the production chain. Experiments and methods within this thesis focus on the load-haul-dump cycle. The development of a representation of bulk material using lumped masses is presented. A method for tracking and estimation of these lumped masses within the mining production chain using an 'Augmented State Kalman Filter' (ASKF) is developed. The method ensures that the fusion of new information at different stages will provide statistically consistent estimates of the lumped mass. There is a particular focus on the feasibility and practicality of implementing a solution on a production mine site given the sensing technology currently available, and on how that technology can be adapted for use within the developed estimation system, with emphasis on remote sensing and volume estimation.
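    To make the consistency argument concrete, the sketch below shows the core idea in a minimal linear-Gaussian form: keeping several lumped-mass estimates jointly in one (augmented) state vector lets a downstream measurement of their combined mass update each lump and the cross-covariances together. All models and numbers are illustrative assumptions, not the thesis's ASKF formulation.

        import numpy as np

        # Assumed illustration: the state stacks the masses of two lumps (tonnes);
        # a downstream sensor (e.g. a weightometer) observes their combined mass.
        x = np.array([48.0, 52.0])          # prior means for lump 1 and lump 2
        P = np.diag([9.0, 9.0])             # prior covariance of the load estimates

        H = np.array([[1.0, 1.0]])          # measurement: sum of the two lumps
        R = np.array([[1.0]])               # weightometer noise variance
        z = np.array([97.0])                # observed combined mass

        # Standard Kalman update applied to the augmented state: both lumps are
        # corrected jointly and their cross-covariance is retained, which is what
        # keeps later fusion steps statistically consistent.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x_post = x + K @ (z - H @ x)
        P_post = (np.eye(2) - K @ H) @ P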

    Probabilistic Framework for Sensor Management

    Get PDF
    A probabilistic sensor management framework is introduced, which maximizes the utility of sensor systems with many different sensing modalities by dynamically configuring the sensor system in the most beneficial way. For this purpose, techniques from stochastic control and Bayesian estimation are combined such that long-term effects of possible sensor configurations and stochastic uncertainties resulting from noisy measurements can be incorporated into the sensor management decisions.
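    One common way to couple Bayesian estimation with a control-style decision is to score each candidate sensor configuration by the posterior uncertainty it is expected to produce and select the best. The sketch below does this greedily for a single step with linear-Gaussian models and a trace criterion; it is an illustrative simplification of the long-horizon stochastic control described in the abstract, and all models and sensor names are assumptions.

        import numpy as np

        def posterior_cov(P, H, R):
            """Kalman posterior covariance for prior P under measurement model (H, R);
            for linear-Gaussian models it does not depend on the measurement value."""
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            return (np.eye(P.shape[0]) - K @ H) @ P

        def greedy_sensor_choice(P_prior, configs):
            """Pick the configuration whose expected posterior covariance has the
            smallest trace (a one-step, myopic stand-in for long-term planning)."""
            scores = {name: np.trace(posterior_cov(P_prior, H, R))
                      for name, (H, R) in configs.items()}
            return min(scores, key=scores.get), scores

        # Illustrative use: choose between a range-only sensor and a full-state sensor.
        P0 = np.diag([4.0, 4.0])
        configs = {
            "range_only": (np.array([[1.0, 0.0]]), np.array([[0.5]])),
            "full_state": (np.eye(2), np.eye(2) * 2.0),
        }
        best, scores = greedy_sensor_choice(P0, configs)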

    Generalized Information Representation and Compression Using Covariance Union

    No full text
    In this paper we consider the use of Covariance Union (CU) with multi-hypothesis techniques (MHT) and Gaussian Mixture Models (GMMs) to generalize the conventional mean and covariance representation of information. More specifically, we address the representation of multimodal information using multiple mean and covariance estimates. A significant challenge is to define a rigorous fusion algorithm that can bound the complexity of the filtering process. This requires a mechanism for subsuming subsets of modes into single modes so that the complexity of the representation satisfies a specified upper bound. We discuss how this can be accomplished using CU. The practical challenge is to develop efficient implementations of the CU algorithm. Because of the novelty of the CU algorithm, there are no existing real-time implementations for use in real applications. In this paper we address this deficiency by considering a general-purpose implementation of the CU algorithm based on general nonlinear optimization techniques. Computational results are reported.
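    For orientation, Covariance Union replaces a set of (mean, covariance) estimates with a single pair (u, U) such that U dominates A_i + (u - a_i)(u - a_i)^T for every input, so the fused estimate remains consistent whichever input was correct. The sketch below constructs such a valid but non-optimal union with a fixed fused mean and an eigenvalue-based bounding step; it is only an assumed illustration of the definition, not the optimized nonlinear-programming implementation reported in the paper.

        import numpy as np

        def dominate(U, M):
            """Return a covariance that upper-bounds both U and M (assumes U is
            positive definite). Uses simultaneous diagonalization; not minimal."""
            L = np.linalg.cholesky(U)
            D = np.linalg.solve(L, np.linalg.solve(L, M).T).T   # L^{-1} M L^{-T}
            lam, V = np.linalg.eigh((D + D.T) / 2.0)
            S = V @ np.diag(np.maximum(lam, 1.0)) @ V.T
            return L @ S @ L.T

        def covariance_union(means, covs, u=None):
            """Fuse several (mean, covariance) estimates into one pair (u, U) with
            U >= A_i + (u - a_i)(u - a_i)^T for every input, so the result is
            consistent whichever input estimate was correct."""
            means = [np.asarray(a, dtype=float) for a in means]
            covs = [np.asarray(A, dtype=float) for A in covs]
            if u is None:                      # simple fused-mean choice, not optimized
                u = np.mean(means, axis=0)
            reqs = [A + np.outer(u - a, u - a) for a, A in zip(means, covs)]
            U = reqs[0]
            for M in reqs[1:]:
                U = dominate(U, M)
            return u, U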