
    Modeling a Sensor to Improve its Efficacy

    Robots rely on sensors to provide them with information about their surroundings. However, high-quality sensors can be extremely expensive and cost-prohibitive. Thus many robotic systems must make do with lower-quality sensors. Here we demonstrate via a case study how modeling a sensor can improve its efficacy when employed within a Bayesian inferential framework. As a test bed we employ a robotic arm that is designed to autonomously take its own measurements using an inexpensive LEGO light sensor to estimate the position and radius of a white circle on a black field. The light sensor integrates the light arriving from a spatially distributed region within its field of view, weighted by its Spatial Sensitivity Function (SSF). We demonstrate that by incorporating an accurate model of the light sensor SSF into the likelihood function of a Bayesian inference engine, an autonomous system can make improved inferences about its surroundings. The method presented here is data-based, fairly general, and made with plug-and-play in mind so that it could be implemented in similar problems.
    Comment: 18 pages, 8 figures, submitted to the special issue of "Sensors for Robotics"
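
The core idea — folding the sensor's SSF into the likelihood as a weighted integral over the hypothesized scene — can be sketched as follows. The Gaussian SSF shape, grid, and noise level here are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def gaussian_ssf(x, y, sigma=1.0):
    """Illustrative Gaussian spatial sensitivity function (assumed shape)."""
    return np.exp(-(x**2 + y**2) / (2.0 * sigma**2))

def predicted_response(cx, cy, r, grid, ssf_sigma=1.0):
    """Predict the scalar sensor reading for a hypothesized white circle
    (center cx, cy, radius r) by integrating the stimulus weighted by the SSF."""
    X, Y = grid
    stimulus = ((X - cx)**2 + (Y - cy)**2 <= r**2).astype(float)  # white on black
    w = gaussian_ssf(X, Y, ssf_sigma)
    return np.sum(stimulus * w) / np.sum(w)

def log_likelihood(data, cx, cy, r, grid, noise=0.05):
    """Gaussian log-likelihood of a measured reading given the circle hypothesis."""
    mu = predicted_response(cx, cy, r, grid)
    return -0.5 * ((data - mu) / noise)**2

x = np.linspace(-3, 3, 61)
grid = np.meshgrid(x, x)
```

A circle centered under the SSF peak yields a larger predicted response than one off to the side, which is exactly what lets the inference engine discriminate hypotheses.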

    Quantum computers can search arbitrarily large databases by a single query

    This paper shows that a quantum mechanical algorithm that can query information relating to multiple items of the database can search a database in a single query (a query is defined as any question to the database to which the database has to return a (YES/NO) answer). A classical algorithm will be limited to the information-theoretic bound of at least O(log N) queries (which it would achieve by using a binary search).
    Comment: Several enhancements to the original paper
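
The classical baseline the abstract refers to is ordinary binary search, which locates an item with about log2(N) YES/NO questions. A minimal sketch of that query-counting baseline:

```python
def binary_search_queries(database, target):
    """Locate `target` in a sorted list via YES/NO questions of the form
    'is the target <= the item at index mid?'.
    Returns (index, query_count); classically this needs ~log2(N) queries."""
    lo, hi = 0, len(database) - 1
    queries = 0
    while lo < hi:
        mid = (lo + hi) // 2
        queries += 1              # one YES/NO question to the database
        if target <= database[mid]:
            hi = mid
        else:
            lo = mid + 1
    return lo, queries

idx, q = binary_search_queries(list(range(1024)), 700)
```

For N = 1024 items this uses exactly 10 queries, matching the log2(N) bound that the quantum algorithm circumvents.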

    Maximum Joint Entropy and Information-Based Collaboration of Automated Learning Machines

    We are working to develop automated intelligent agents, which can act and react as learning machines with minimal human intervention. To accomplish this, an intelligent agent is viewed as a question-asking machine, which is designed by coupling the processes of inference and inquiry to form a model-based learning unit. In order to select maximally-informative queries, the intelligent agent needs to be able to compute the relevance of a question. This is accomplished by employing the inquiry calculus, which is dual to the probability calculus, and extends information theory by explicitly requiring context. Here, we consider the interaction between two question-asking intelligent agents, and note that there is a potential information redundancy with respect to the two questions that the agents may choose to pose. We show that the information redundancy is minimized by maximizing the joint entropy of the questions, which simultaneously maximizes the relevance of each question while minimizing the mutual information between them. Maximum joint entropy is therefore an important principle of information-based collaboration, which enables intelligent agents to efficiently learn together.Comment: 8 pages, 1 figure, to appear in the proceedings of MaxEnt 2011 held in Waterloo, Canada
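
The trade-off the abstract describes rests on the identity H(X,Y) = H(X) + H(Y) − I(X;Y): maximizing joint entropy simultaneously pushes up the individual entropies and pushes down the redundancy. This can be checked numerically on a toy joint distribution (the distributions below are invented for illustration):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a 2-D joint distribution."""
    return entropy(joint.sum(axis=1)) + entropy(joint.sum(axis=0)) - entropy(joint)

# Two illustrative joint distributions over the agents' question outcomes:
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])   # maximal joint entropy, no redundancy
redundant = np.array([[0.5, 0.0],
                      [0.0, 0.5]])       # answers fully determine each other
```

The independent case attains the maximal joint entropy of 2 bits with zero mutual information, while the redundant case wastes a full bit on overlap — the situation maximum joint entropy is designed to avoid.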

    Origins of the Combinatorial Basis of Entropy

    The combinatorial basis of entropy, given by Boltzmann, can be written $H = N^{-1} \ln \mathbb{W}$, where $H$ is the dimensionless entropy, $N$ is the number of entities and $\mathbb{W}$ is the number of ways in which a given realization of a system can occur (its statistical weight). This can be broadened to give generalized combinatorial (or probabilistic) definitions of entropy and cross-entropy: $H = \kappa (\phi(\mathbb{W}) + C)$ and $D = -\kappa (\phi(\mathbb{P}) + C)$, where $\mathbb{P}$ is the probability of a given realization, $\phi$ is a convenient transformation function, $\kappa$ is a scaling parameter and $C$ an arbitrary constant. If $\mathbb{W}$ or $\mathbb{P}$ satisfy the multinomial weight or distribution, then using $\phi(\cdot) = \ln(\cdot)$ and $\kappa = N^{-1}$, $H$ and $D$ asymptotically converge to the Shannon and Kullback-Leibler functions. In general, however, $\mathbb{W}$ or $\mathbb{P}$ need not be multinomial, nor may they approach an asymptotic limit. In such cases, the entropy or cross-entropy function can be defined so that its extremization ("MaxEnt" or "MinXEnt"), subject to the constraints, gives the "most probable" ("MaxProb") realization of the system. This gives a probabilistic basis for MaxEnt and MinXEnt, independent of any information-theoretic justification. This work examines the origins of the governing distribution $\mathbb{P}$...
    Comment: MaxEnt07 manuscript, version 4 revised
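
The convergence of $N^{-1} \ln \mathbb{W}$ to the Shannon entropy in the multinomial case can be checked numerically; this sketch uses `lgamma` for the log-factorials and an invented probability vector:

```python
from math import lgamma, log

def log_weight(counts):
    """ln W for the multinomial statistical weight W = N! / prod(n_i!)."""
    N = sum(counts)
    return lgamma(N + 1) - sum(lgamma(n + 1) for n in counts)

def shannon(probs):
    """Shannon entropy in nats."""
    return -sum(p * log(p) for p in probs if p > 0)

probs = [0.5, 0.3, 0.2]          # illustrative occupation probabilities
N = 10000
counts = [int(p * N) for p in probs]
H = log_weight(counts) / N       # N^{-1} ln W
```

For N = 10000 the combinatorial entropy H already agrees with the Shannon value to better than one percent, illustrating the asymptotic convergence the abstract states.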

    The Spatial Sensitivity Function of a Light Sensor

    The Spatial Sensitivity Function (SSF) is used to quantify a detector's sensitivity to a spatially-distributed input signal. By weighting the incoming signal with the SSF and integrating, the overall scalar response of the detector can be estimated. This project focuses on estimating the SSF of a light intensity sensor consisting of a photodiode. This light sensor has been used previously in the Knuth Cyberphysics Laboratory on a robotic arm that performs its own experiments to locate a white circle in a dark field (Knuth et al., 2007). To use the light sensor to learn about its surroundings, the robot's inference software must be able to model and predict the light sensor's response to a hypothesized stimulus. Previous models of the light sensor treated it as a point sensor and ignored its spatial characteristics. Here we propose a parametric approach where the SSF is described by a mixture of Gaussians (MOG). By performing controlled calibration experiments with known stimulus inputs, we used nested sampling to estimate the SSF of the light sensor using an MOG model with the number of Gaussians ranging from one to five. By comparing the evidence computed for each MOG model, we found that one Gaussian is sufficient to describe the SSF to the accuracy we require. Future work will involve incorporating this more accurate SSF into the Bayesian machine learning software for the robotic system and studying how this detailed information about the properties of the light sensor will improve the robot's ability to learn.
    Comment: Published in MaxEnt 200
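
The mixture-of-Gaussians parameterization of the SSF can be sketched directly; the amplitudes, means, and widths below are invented placeholders, not the calibrated values from the nested-sampling runs:

```python
import numpy as np

def mog_ssf(x, y, components):
    """Spatial sensitivity modeled as a mixture of 2-D Gaussians.
    components: list of (amplitude, mean_x, mean_y, sigma) tuples."""
    x = np.asarray(x, dtype=float)
    s = np.zeros_like(x)
    for a, mx, my, sig in components:
        s = s + a * np.exp(-((x - mx)**2 + (y - my)**2) / (2.0 * sig**2))
    return s

# The evidence comparison favored a single Gaussian; parameters here are invented.
one_gaussian = [(1.0, 0.0, 0.0, 0.8)]
two_gaussian = one_gaussian + [(0.2, 0.5, 0.0, 0.4)]
```

Adding components (up to the five considered) only changes this list of tuples, which is what makes the evidence-based comparison across model orders straightforward.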

    Origin of Complex Quantum Amplitudes and Feynman's Rules

    Complex numbers are an intrinsic part of the mathematical formalism of quantum theory, and are perhaps its most mysterious feature. In this paper, we show that the complex nature of the quantum formalism can be derived directly from the assumption that a pair of real numbers is associated with each sequence of measurement outcomes, with the probability of this sequence being a real-valued function of this number pair. By making use of elementary symmetry conditions, and without assuming that these real number pairs have any other algebraic structure, we show that these pairs must be manipulated according to the rules of complex arithmetic. We demonstrate that these complex numbers combine according to Feynman's sum and product rules, with the modulus-squared yielding the probability of a sequence of outcomes.
    Comment: v2: Clarifications, and minor corrections and modifications. Results unchanged. v3: Minor changes to introduction and conclusion
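
The pair-manipulation rules the paper derives coincide with complex arithmetic: series composition is complex multiplication, parallel alternatives add, and the modulus-squared gives the probability. A toy check with invented amplitude pairs:

```python
def pair_product(a, b):
    """Feynman product rule (series composition): complex multiplication."""
    return (a[0] * b[0] - a[1] * b[1], a[0] * b[1] + a[1] * b[0])

def pair_sum(a, b):
    """Feynman sum rule (indistinguishable alternatives): componentwise sum."""
    return (a[0] + b[0], a[1] + b[1])

def probability(a):
    """Modulus-squared of the pair yields the probability of the sequence."""
    return a[0]**2 + a[1]**2

u = (0.6, 0.8)            # illustrative amplitude pairs
v = (0.8, -0.6)
```

Note that the probability of a composed sequence factorizes, probability(pair_product(u, v)) = probability(u) * probability(v), just as the modulus-squared of a product of complex numbers does.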

    Formalizing Size-Optimal Sorting Networks: Extracting a Certified Proof Checker

    Since the proof of the four color theorem in 1976, computer-generated proofs have become a reality in mathematics and computer science. During the last decade, we have seen formal proofs using verified proof assistants being used to verify the validity of such proofs. In this paper, we describe a formalized theory of size-optimal sorting networks. From this formalization we extract a certified checker that successfully verifies computer-generated proofs of optimality on up to 8 inputs. The checker relies on an untrusted oracle to shortcut the search for witnesses on more than 1.6 million NP-complete subproblems.
    Comment: IMADA-preprint-c
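
Checking that a candidate network sorts reduces, by the zero-one principle, to testing all 2^n Boolean inputs — the kind of exhaustive check a certified checker mechanizes. A minimal uncertified sketch, using the well-known 5-comparator network on 4 wires as the size-optimal example:

```python
from itertools import product

def apply_network(network, values):
    """Run a comparator network: each (i, j) swaps an out-of-order pair."""
    v = list(values)
    for i, j in network:
        if v[i] > v[j]:
            v[i], v[j] = v[j], v[i]
    return v

def is_sorting_network(network, n):
    """Zero-one principle: sorting every 0/1 input implies sorting all inputs."""
    return all(apply_network(network, bits) == sorted(bits)
               for bits in product((0, 1), repeat=n))

net4 = [(0, 1), (2, 3), (0, 2), (1, 3), (1, 2)]   # size-optimal for n = 4
```

Removing any comparator from net4 breaks it, which is what "size-optimal" means at this scale; the formal development certifies the analogous (vastly larger) checks up to 8 inputs.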

    Improved quantum algorithms for the ordered search problem via semidefinite programming

    One of the most basic computational problems is the task of finding a desired item in an ordered list of N items. While the best classical algorithm for this problem uses $\log_2 N$ queries to the list, a quantum computer can solve the problem using a constant factor fewer queries. However, the precise value of this constant is unknown. By characterizing a class of quantum query algorithms for ordered search in terms of a semidefinite program, we find new quantum algorithms for small instances of the ordered search problem. Extending these algorithms to arbitrarily large instances using recursion, we show that there is an exact quantum ordered search algorithm using $4 \log_{605} N \approx 0.433 \log_2 N$ queries, which improves upon the previously best known exact algorithm.
    Comment: 8 pages, 4 figures
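
The quoted constant follows from the change-of-base formula: $4 \log_{605} N = (4 / \log_2 605) \log_2 N$. A quick numeric check (the base 605 comes from the paper's recursive construction):

```python
import math

def quantum_queries(N):
    """Exact quantum ordered-search query count: 4 * log base 605 of N."""
    return 4.0 * math.log(N) / math.log(605)

def classical_queries(N):
    """Classical binary-search baseline: log2(N)."""
    return math.log2(N)

# The ratio is independent of N: 4 / log2(605) ~ 0.433
ratio = quantum_queries(10**6) / classical_queries(10**6)
```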

    Bayesian Evidence and Model Selection

    In this paper we review the concepts of Bayesian evidence and Bayes factors, also known as log odds ratios, and their application to model selection. The theory is presented along with a discussion of analytic, approximate and numerical techniques. Specific attention is paid to the Laplace approximation, variational Bayes, importance sampling, thermodynamic integration, and nested sampling and its recent variants. Analogies to statistical physics, from which many of these techniques originate, are discussed in order to provide readers with deeper insights that may lead to new techniques. The utility of Bayesian model testing in the domain sciences is demonstrated by presenting four specific practical examples considered within the context of signal processing in the areas of signal detection, sensor characterization, scientific model selection and molecular force characterization.
    Comment: Arxiv version consists of 58 pages and 9 figures. Features theory, numerical methods and four applications
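
Of the techniques surveyed, the Laplace approximation is the simplest to sketch: a Gaussian fit at the likelihood peak turns the evidence integral into a closed form. A one-dimensional example with an invented toy likelihood and a uniform prior on [-5, 5]:

```python
import math

def log_like(theta, mu=1.0, sigma=0.1):
    """Toy Gaussian log-likelihood (illustrative, not from the paper)."""
    return -0.5 * ((theta - mu) / sigma)**2

def laplace_evidence(mu=1.0, sigma=0.1, prior=1 / 10):
    """Laplace approximation: Z ~ L(peak) * prior(peak) * sqrt(2*pi*sigma^2),
    where sigma^2 is minus the inverse curvature of ln L at the peak."""
    return math.exp(log_like(mu, mu, sigma)) * prior * math.sqrt(2 * math.pi * sigma**2)

def numeric_evidence(mu=1.0, sigma=0.1, prior=1 / 10, n=20000):
    """Brute-force midpoint quadrature of Z = integral of L * prior over [-5, 5]."""
    lo, hi = -5.0, 5.0
    h = (hi - lo) / n
    return sum(math.exp(log_like(lo + (k + 0.5) * h, mu, sigma)) * prior * h
               for k in range(n))
```

Because the toy likelihood is exactly Gaussian, the Laplace value matches the quadrature; for the non-Gaussian posteriors in the paper's examples, this is where variational Bayes, thermodynamic integration, or nested sampling take over.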