
    On the Limited Communication Analysis and Design for Decentralized Estimation

    This paper pertains to the analysis and design of decentralized estimation schemes that make use of limited communication. Briefly, these schemes equip the sensors with scalar states that iteratively merge the measurements and the states of other sensors to be used for state estimation. Contrary to commonly used distributed estimation schemes, the only information exchanged consists of scalars, there is only one common time-scale for communication and estimation, and the retrieval of the state of the system and sensors is achieved in finite time. We extend previous work to a more general setup and provide necessary and sufficient conditions on the communication between the sensors that enable the use of limited communication decentralized estimation schemes. Additionally, we discuss the cases where the sensors are memoryless, and where the sensors might not have the capacity to discern the contributions of other sensors. Based on these conditions and the fact that communication channels incur a cost, we cast the problem of finding the minimum-cost communication graph that enables limited communication decentralized estimation schemes as an integer programming problem.
    Comment: Updates on the paper in CDC 201
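
    A minimal sketch of the scalar-state idea described above (illustrative only; the dynamics A, measurement matrix C, and communication weights W below are assumptions, not the authors' construction): each sensor keeps a single scalar that merges its own measurement with the scalars received from its neighbours, so only scalars ever cross the network.

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 4, 3                                    # plant states, sensors (illustrative sizes)
    A = 0.3 * rng.standard_normal((n, n))          # plant dynamics (assumed for illustration)
    C = rng.standard_normal((m, n))                # one scalar measurement per sensor
    W = np.array([[0.5, 0.5, 0.0],                 # hypothetical communication weights:
                  [0.0, 0.5, 0.5],                 # sensor i listens to sensor j iff W[i, j] != 0
                  [0.5, 0.0, 0.5]])

    x = rng.standard_normal(n)                     # plant state
    z = np.zeros(m)                                # scalar sensor states

    for k in range(10):
        y = C @ x                                  # each sensor takes its scalar measurement
        z = W @ z + y                              # merge neighbours' scalars with own measurement
        x = A @ x                                  # plant evolves

    print(z)                                       # the scalars a decoding rule would use for estimation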

    The Variational Homoencoder: Learning to learn high capacity generative models from few examples

    Hierarchical Bayesian methods can unify many related tasks (e.g. k-shot classification, conditional and unconditional generation) as inference within a single generative model. However, when this generative model is expressed as a powerful neural network such as a PixelCNN, we show that existing learning techniques typically fail to effectively use latent variables. To address this, we develop a modification of the Variational Autoencoder in which encoded observations are decoded to new elements from the same class. This technique, which we call a Variational Homoencoder (VHE), produces a hierarchical latent variable model which better utilises latent variables. We use the VHE framework to learn a hierarchical PixelCNN on the Omniglot dataset, which outperforms all existing models on test set likelihood and achieves strong performance on one-shot generation and classification tasks. We additionally validate the VHE on natural images from the YouTube Faces database. Finally, we develop extensions of the model that apply to richer dataset structures such as factorial and hierarchical categories.
    Comment: UAI 2018 oral presentation
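
    A minimal sketch of the VHE-style objective described above, under stated assumptions: simple Gaussian encoder and decoder layers stand in for the paper's hierarchical PixelCNN, the support set is a single element of the class, and the KL term is shared across the class by scaling it with 1/|class|. Shapes, names, and the squared-error reconstruction term are illustrative only.

    import torch
    import torch.nn as nn

    latent, obs = 16, 64
    enc = nn.Linear(obs, 2 * latent)     # encodes a support element into (mu, logvar)
    dec = nn.Linear(latent, obs)         # decodes a latent sample into a reconstruction

    def vhe_loss(x_target, x_support, class_size):
        # Encode a *different* element of the same class than the one being reconstructed.
        mu, logvar = enc(x_support).chunk(2, dim=-1)
        c = mu + torch.randn_like(mu) * (0.5 * logvar).exp()     # reparameterised sample from q(c|D)
        nll = ((dec(c) - x_target) ** 2).sum()                   # reconstruction term (Gaussian stand-in)
        kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum()  # KL(q(c|D) || N(0, I))
        return nll + kl / class_size                             # KL amortised over the whole class

    x_a, x_b = torch.randn(obs), torch.randn(obs)                # two elements of one (synthetic) class
    vhe_loss(x_target=x_a, x_support=x_b, class_size=20).backward()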

    Decentralized Observability with Limited Communication between Sensors

    In this paper, we study the problem of jointly retrieving the state of a dynamical system, as well as the state of the sensors deployed to estimate it. We assume that the sensors possess a simple computational unit that is capable of performing simple operations, such as retaining the current state and model of the system in its memory. We assume the system to be observable (given all the measurements of the sensors), and we ask whether each sub-collection of sensors can retrieve the state of the underlying physical system, as well as the state of the remaining sensors. To this end, we consider communication between neighboring sensors, whose adjacency is captured by a communication graph. We then propose a linear update strategy that encodes the sensor measurements as states in an augmented state space, with which we provide the solution to the problem of retrieving the system and sensor states. The present paper contains three main contributions. First, we provide necessary and sufficient conditions to ensure observability of the system and sensor states from any sensor. Second, we address the problem of adding communication between sensors when the necessary and sufficient conditions are not satisfied, and devise a strategy to this end. Third, we extend the former case to include different costs of communication between sensors. Finally, the concepts defined and the method proposed are used to assess the state of an example of approximate structural brain dynamics through linearized measurements.
    Comment: 15 pages, 5 figures, extended version of paper accepted at IEEE Conference on Decision and Control 201
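
    A minimal sketch of the augmented-state idea described above, assuming illustrative dynamics and a hypothetical communication pattern W rather than the authors' exact construction: the plant state and the sensor states are stacked into one augmented system, and each sensor applies a standard Kalman rank test to the part of that augmented system it can see.

    import numpy as np

    n, m = 3, 3
    A = np.array([[0.9, 1.0, 0.0],
                  [0.0, 0.9, 1.0],
                  [0.0, 0.0, 0.9]])                 # plant dynamics (illustrative)
    C = np.eye(m, n)                                # sensor i measures plant state i
    W = np.array([[1, 1, 0],
                  [0, 1, 1],
                  [1, 0, 1]], dtype=float)          # hypothetical communication pattern

    # Augmented dynamics: the plant drives the sensor states, sensors mix over the graph.
    A_aug = np.block([[A, np.zeros((n, m))],
                      [C, W]])

    def observable_from(i):
        # Sensor i sees its own sensor state and those of its in-neighbours.
        rows = [n + j for j in range(m) if W[i, j] != 0]
        C_i = np.eye(n + m)[rows]
        O = np.vstack([C_i @ np.linalg.matrix_power(A_aug, k) for k in range(n + m)])
        return np.linalg.matrix_rank(O) == n + m

    print([observable_from(i) for i in range(m)])   # can each sensor recover system and sensor states?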

    Cloud-based Quadratic Optimization with Partially Homomorphic Encryption

    The development of large-scale distributed control systems has led to the outsourcing of costly computations to cloud-computing platforms, as well as to concerns about privacy of the collected sensitive data. This paper develops a cloud-based protocol for a quadratic optimization problem involving multiple parties, each holding information it seeks to maintain private. The protocol is based on projected gradient ascent on the Lagrange dual problem and exploits partially homomorphic encryption and secure multi-party computation techniques. Using formal cryptographic definitions of indistinguishability, the protocol is shown to achieve computational privacy, i.e., there is no computationally efficient algorithm that any involved party can employ to obtain private information beyond what can be inferred from the party's inputs and outputs only. In order to reduce the communication complexity of the proposed protocol, we introduce a variant that achieves this objective at the expense of weaker privacy guarantees. We discuss in detail the computational and communication complexity properties of both algorithms, both theoretically and through implementations. We conclude the paper with a discussion on computational privacy and other notions of privacy such as the non-unique retrieval of the private information from the protocol outputs.
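
    A minimal plaintext sketch of the iteration the protocol builds on, projected gradient ascent on the Lagrange dual of a quadratic program with inequality constraints; the encryption layer, the multi-party splitting of the data, and all problem data below are omitted or assumed for illustration.

    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 4, 3
    Q = 2.0 * np.eye(n)                 # strongly convex quadratic cost (illustrative)
    r = rng.standard_normal(n)
    A = rng.standard_normal((p, n))     # inequality constraints A x <= b
    b = rng.standard_normal(p) + 1.0

    lam = np.zeros(p)                   # dual variables, kept nonnegative by projection
    eta = 0.05                          # step size

    for _ in range(500):
        x = -np.linalg.solve(Q, r + A.T @ lam)             # primal minimiser for fixed lam
        lam = np.maximum(0.0, lam + eta * (A @ x - b))     # projected dual ascent step

    print("x* =", x, " duals:", lam)

    Additively homomorphic (e.g. Paillier-style) encryption would let a cloud evaluate the affine parts of such updates directly on ciphertexts; the sketch above leaves all data in the clear.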

    Neuromarketing for a better understanding of consumer needs and emotions

    Marketing and advertising specialists have been aware of the limitations of traditional market research methods for decades, but only in recent years has science made possible a more effective mechanism for deciphering consumers' thoughts: neuromarketing. The term refers to the use of techniques developed by cognitive neuroscience and psychology specialists to analyze and understand people's reactions to products and promotions, which allows marketing efforts to be refined and made more effective. The article discusses the tools used for this purpose, which include magnetic resonance imaging (MRI), brain scanners that identify the brain regions reacting to different stimuli, and electroencephalography (EEG), devices that measure electrical activity in the brain. By tracking brain reactions to different stimuli, researchers can discover the marketing mechanisms most likely to lead to the desired outcome: selling the product. In parallel with the EEG measurements, an eye-tracking device is used to identify exactly which stimulus produces the reaction at a given moment. Some neuromarketing companies also use GSR (galvanic skin response) sensors to measure the electrical conductivity of the skin, another signal that provides information about the consumer's response to commercial messages. The purpose of the article is to show the role played by neuromarketing in the correct understanding of consumer needs and emotions.