Functional Distributional Semantics
Vector space models have become popular in distributional semantics, despite the challenges they face in capturing various semantic phenomena. We propose a novel probabilistic framework which draws on both formal semantics and recent advances in machine learning. In particular, we separate predicates from the entities they refer to, allowing us to perform Bayesian inference based on logical forms. We describe an implementation of this framework using a combination of Restricted Boltzmann Machines and feedforward neural networks. Finally, we demonstrate the feasibility of this approach by training it on a parsed corpus and evaluating it on established similarity datasets
Hierarchical spatial-temporal state machine for vehicle instrument cluster manufacturing
The vehicle instrument cluster is one of the most advanced and complicated electronic embedded control systems in modern vehicles, providing the driver with an interface to control the vehicle and monitor its status. In this paper, we develop a novel hybrid approach called the Hierarchical Spatial-Temporal State Machine (HSTSM), which addresses the problem of spatial-temporal inference in complex dynamic systems. It is based on a memory-prediction framework and deep neural networks (DNNs), and is used for fault detection and isolation in the automatic inspection and manufacturing of vehicle instrument clusters. The technique has been compared with existing methods, namely rule-based, template-based, Bayesian, restricted Boltzmann machine, and hierarchical temporal memory methods. Results show that the proposed approach can successfully diagnose and locate multiple classes of faults under real-time working conditions
The Recurrent Temporal Discriminative Restricted Boltzmann Machines
Classification of sequence data is a topic of interest for both dynamic Bayesian models and Recurrent Neural Networks (RNNs). While the former can explicitly model the temporal dependencies between class variables, the latter are well suited to learning representations. Several attempts have been made to improve performance by combining these two approaches or by increasing the processing capability of the hidden units in RNNs. This often results in complex models with a large number of learning parameters. In this paper, a compact model is proposed which offers both representation learning and temporal inference of class variables by rolling Restricted Boltzmann Machines (RBMs) and class variables over time. We address the key issue of intractability in this variant of RBMs by optimising a conditional distribution instead of a joint distribution. Experiments reported in the paper on melody modelling and optical character recognition show that the proposed model can outperform the state of the art. Also, the experimental results on optical character recognition, part-of-speech tagging and text chunking demonstrate that our model is comparable to recurrent neural networks with complex memory gates while requiring far fewer parameters
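The tractable "conditional instead of joint" idea mentioned in this abstract builds on the discriminative RBM, in which the class posterior p(y | x) has a closed form because the binary hidden units can be summed out analytically. A minimal sketch of that exact conditional (the parameter names, shapes, and toy values here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def softplus(x):
    # numerically stable log(1 + exp(x))
    return np.logaddexp(0.0, x)

def drbm_conditional(x, W, U, c, d):
    """Exact class posterior p(y | x) of a discriminative RBM.

    x : (n_vis,)        binary input vector
    W : (n_hid, n_vis)  input-to-hidden weights
    U : (n_hid, n_cls)  class-to-hidden weights
    c : (n_hid,)        hidden biases
    d : (n_cls,)        class biases
    """
    # Summing out the hidden units gives, for each class y:
    #   log p(y | x) = d_y + sum_j softplus(c_j + U_{jy} + (W x)_j) + const
    act = c[:, None] + U + (W @ x)[:, None]       # (n_hid, n_cls)
    log_scores = d + softplus(act).sum(axis=0)    # (n_cls,)
    log_scores -= log_scores.max()                # stabilise the exponent
    p = np.exp(log_scores)
    return p / p.sum()

# toy example
rng = np.random.default_rng(0)
n_vis, n_hid, n_cls = 6, 4, 3
W = rng.normal(scale=0.1, size=(n_hid, n_vis))
U = rng.normal(scale=0.1, size=(n_hid, n_cls))
c = rng.normal(scale=0.1, size=n_hid)
d = rng.normal(scale=0.1, size=n_cls)
x = rng.integers(0, 2, size=n_vis).astype(float)

p = drbm_conditional(x, W, U, c, d)
print(p)  # a proper distribution over the classes
```

Because no partition function over x is needed, this conditional can be optimised directly by gradient ascent, which is what makes the conditional objective tractable where the joint one is not.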
A Deterministic and Generalized Framework for Unsupervised Learning with Restricted Boltzmann Machines
Restricted Boltzmann machines (RBMs) are energy-based neural networks which are commonly used as the building blocks of deep neural architectures. In this work, we derive a deterministic framework for the training, evaluation, and use of RBMs based upon the Thouless-Anderson-Palmer (TAP) mean-field approximation of widely connected systems with weak interactions, which originates in spin-glass theory. While the TAP approach has been extensively studied for fully visible binary spin systems, our construction generalizes to latent-variable models, as well as to arbitrarily distributed real-valued spin systems with bounded support. In our numerical experiments, we demonstrate the effective deterministic training of our proposed models and show interesting features of unsupervised learning which could not be directly observed with sampling. Additionally, we demonstrate how to use our TAP-based framework to leverage trained RBMs as joint priors in denoising problems
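The TAP machinery this abstract refers to can be illustrated on the simplest case, a binary RBM, where the second-order (Onsager-corrected) mean-field magnetisations are found by iterating a pair of self-consistency equations. A minimal sketch, with damping schedule, iteration count, and initialisation chosen for illustration rather than taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tap_fixed_point(W, a, b, n_iter=200, damping=0.5, seed=0):
    """Second-order (TAP) mean-field magnetisations of a binary RBM.

    Iterates, with damping, the self-consistency equations
      m_v = sigmoid(a + W m_h - (m_v - 1/2) * W^2 (m_h (1 - m_h)))
      m_h = sigmoid(b + W' m_v - (m_h - 1/2) * W'^2 (m_v (1 - m_v)))
    where the last term in each update is the Onsager reaction
    correction that distinguishes TAP from naive mean field.
    """
    rng = np.random.default_rng(seed)
    m_v = rng.uniform(0.4, 0.6, size=a.shape)
    m_h = rng.uniform(0.4, 0.6, size=b.shape)
    W2 = W ** 2
    for _ in range(n_iter):
        m_h_new = sigmoid(b + W.T @ m_v
                          - (m_h - 0.5) * (W2.T @ (m_v * (1 - m_v))))
        m_h = damping * m_h + (1 - damping) * m_h_new
        m_v_new = sigmoid(a + W @ m_h
                          - (m_v - 0.5) * (W2 @ (m_h * (1 - m_h))))
        m_v = damping * m_v + (1 - damping) * m_v_new
    return m_v, m_h

# toy example with weak random couplings (the regime where TAP applies)
rng = np.random.default_rng(1)
W = rng.normal(scale=0.1, size=(8, 5))
a = rng.normal(scale=0.1, size=8)
b = rng.normal(scale=0.1, size=5)
m_v, m_h = tap_fixed_point(W, a, b)
print(m_v, m_h)
```

The fixed point of this iteration yields a deterministic estimate of the unit expectations, which is what allows training and evaluation without Monte Carlo sampling.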
Inferring Sparsity: Compressed Sensing using Generalized Restricted Boltzmann Machines
In this work, we consider compressed sensing reconstruction from measurements of -sparse structured signals which do not possess a writable correlation model. Assuming that a generative statistical model, such as a Boltzmann machine, can be trained in an unsupervised manner on example signals, we demonstrate how this signal model can be used within a Bayesian framework of signal reconstruction. By deriving a message-passing inference for general-distribution restricted Boltzmann machines, we are able to integrate these inferred signal models into approximate message passing for compressed sensing reconstruction. Finally, we show for the MNIST dataset that this approach can be very effective, even for .
Comment: IEEE Information Theory Workshop, 201
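The Bayesian use of a trained RBM as a signal model rests on the fact that, for a binary-hidden RBM, the hidden units can be summed out exactly, giving a closed-form unnormalised log-prior (the negative free energy) over candidate signals. A minimal sketch of that free-energy computation, with parameter names chosen for illustration (the paper's message-passing scheme itself is not reproduced here):

```python
import numpy as np

def softplus(x):
    # numerically stable log(1 + exp(x))
    return np.logaddexp(0.0, x)

def rbm_free_energy(v, W, a, b):
    """Free energy F(v) of a visible vector under a binary-hidden RBM.

    v : (n_vis,)        visible (signal) vector
    W : (n_vis, n_hid)  coupling weights
    a : (n_vis,)        visible biases
    b : (n_hid,)        hidden biases

    Summing out the hidden units analytically gives
      F(v) = -a.v - sum_j softplus(b_j + (W' v)_j),
    so p(v) is proportional to exp(-F(v)) with no sampling required.
    """
    return -(v @ a) - softplus(b + W.T @ v).sum()

# toy example: score one candidate signal under a random RBM
rng = np.random.default_rng(2)
n_vis, n_hid = 5, 3
W = rng.normal(scale=0.5, size=(n_vis, n_hid))
a = rng.normal(scale=0.5, size=n_vis)
b = rng.normal(scale=0.5, size=n_hid)
v = rng.integers(0, 2, size=n_vis).astype(float)
print(rbm_free_energy(v, W, a, b))
```

A reconstruction algorithm can then use -F(v) as the log-prior term it trades off against the measurement likelihood, which is the role the trained signal model plays in the Bayesian framework above.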