
    RBFNN-based Minimum Entropy Filtering for a Class of Stochastic Nonlinear Systems

    This paper presents a novel minimum entropy filter design for a class of stochastic nonlinear systems subject to non-Gaussian noise. Motivated by stochastic distribution control, an output entropy model is developed using an RBF neural network, and the parameters of the model can be identified from collected data. Based upon this model, the filtering problem is investigated while the system dynamics are represented. Since the model output is the entropy of the estimation error, the optimal nonlinear filter is obtained via a Lyapunov design that makes the model output minimal. Moreover, the entropy assignment problem is discussed as an extension of the presented approach. To verify the design procedure, a numerical example is given that illustrates the effectiveness of the presented algorithm. The contributions of this paper can be summarized as follows: 1) an output entropy model is presented using a neural network; 2) a nonlinear filter design algorithm is developed as the main result; and 3) a solution to the entropy assignment problem is obtained as an extension of the presented framework.
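    The abstract describes a two-step idea: build a data-driven model of the estimation-error entropy with an RBF network, then choose the filter that minimises the model output. The Python sketch below illustrates that idea under illustrative assumptions (a scalar toy system, Laplace noise as the non-Gaussian disturbance, a histogram entropy estimate, and a scalar filter gain as the design variable); it is not the paper's Lyapunov-based algorithm.

```python
# Minimal sketch (assumptions stated above), not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

def error_entropy(gain, n=4000):
    """Run a toy scalar filter with the given gain under Laplace (non-Gaussian)
    noise and return a histogram estimate of the estimation-error entropy."""
    x, xhat, err = 0.0, 0.0, []
    for _ in range(n):
        x = 0.9 * x + rng.laplace(scale=0.5)         # non-Gaussian process noise
        y = x + rng.laplace(scale=0.3)               # non-Gaussian measurement noise
        xhat = 0.9 * xhat + gain * (y - 0.9 * xhat)  # candidate filter
        err.append(x - xhat)
    p, edges = np.histogram(err, bins=40, density=True)
    width = edges[1] - edges[0]
    p = p[p > 0]
    return -np.sum(p * np.log(p)) * width

# "Collected data": estimated error entropy for a grid of candidate gains.
gains = np.linspace(0.05, 0.95, 19)
H = np.array([error_entropy(g) for g in gains])

# RBF model of the gain -> entropy map; output weights identified by least squares.
centers, rbf_width = np.linspace(0.05, 0.95, 9), 0.15
Phi = np.exp(-0.5 * ((gains[:, None] - centers[None, :]) / rbf_width) ** 2)
wts, *_ = np.linalg.lstsq(Phi, H, rcond=None)

# Filter "design" in this toy setting: minimise the RBF model output over a fine grid.
fine = np.linspace(0.05, 0.95, 500)
Phi_fine = np.exp(-0.5 * ((fine[:, None] - centers[None, :]) / rbf_width) ** 2)
best_gain = fine[np.argmin(Phi_fine @ wts)]
print(f"gain with minimum modelled error entropy: {best_gain:.3f}")
```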

    An Optimal Control Derivation of Nonlinear Smoothing Equations

    The purpose of this paper is to review and highlight some connections between the problem of nonlinear smoothing and optimal control of the Liouville equation. The latter has been an active area of recent research interest owing to work in mean-field games and optimal transportation theory. The nonlinear smoothing problem is considered here for continuous-time Markov processes. The observation process is modeled as a nonlinear function of a hidden state with additive Gaussian measurement noise. A variational formulation is described based upon the relative entropy formula introduced by Newton and Mitter. The resulting optimal control problem is formulated on the space of probability distributions. Hamilton's equations of the optimal control are related to the Zakai equation of nonlinear smoothing via the log transformation. The overall procedure is shown to generalize Mortensen's classical minimum-energy estimator for the linear Gaussian problem. Comment: 7 pages, 0 figures, under peer review.
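    The log transformation mentioned above can be illustrated in a deliberately simplified setting. The LaTeX sketch below drops the drift term, takes a constant diffusion coefficient, and uses a generic potential c(x,t) as a stand-in for the observation-dependent terms of the (robust-form) Zakai equation; it only shows the basic mechanism by which a linear Zakai-type equation is turned into an HJB-type equation.

```latex
% Simplified log-transform sketch (assumptions: no drift, constant diffusion
% sigma, generic potential c standing in for the observation terms).
\[
  \partial_t q \;=\; \tfrac{1}{2}\sigma^{2}\,\Delta q \;+\; c(x,t)\,q
  \qquad \text{(linear, Zakai-type)}
\]
\[
  S \;=\; -\log q
  \;\;\Longrightarrow\;\;
  \partial_t S \;=\; \tfrac{1}{2}\sigma^{2}\,\Delta S
  \;-\; \tfrac{1}{2}\sigma^{2}\,\lvert \nabla S \rvert^{2} \;-\; c(x,t)
  \qquad \text{(HJB-type)}
\]
```

    In the vanishing-noise limit the second equation becomes a first-order Hamilton-Jacobi equation of the kind underlying minimum-energy estimation, which is consistent with the generalization of Mortensen's estimator claimed in the abstract.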

    Kalman meets Shannon

    We consider the problem of communicating the state of a dynamical system over a Shannon Gaussian channel. The receiver, which acts as both decoder and estimator, observes the noisy channel output and makes an optimal estimate of the state of the dynamical system in the minimum mean square sense. The transmitter observes a possibly noisy measurement of the state of the dynamical system. These measurements are then used to encode the message to be transmitted over a noisy Gaussian channel, where a per-sample power constraint is imposed on the transmitted message. We thus obtain a problem that mixes Shannon's source-channel coding problem with a form of Kalman filtering. We first consider the problem of communication with full state measurements at the transmitter and show that optimal linear encoders do not need memory and that optimal linear decoders have an order of at most the state dimension. We also give the structure of the optimal linear filters explicitly. For the case where the transmitter has access to noisy measurements of the state, we derive a separation principle for the optimal communication scheme, in which the transmitter needs a filter of order at most the dimension of the state of the dynamical system. The results are derived for first-order linear dynamical systems, but may be extended to MIMO systems of arbitrary order.
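    As a concrete illustration of the structure described above (a memoryless linear encoder at the transmitter and an estimator of the same order as the state at the receiver), here is a minimal Python simulation for a scalar first-order system. The encoder gain is a simple stationary power normalisation and the decoder is a standard Kalman filter; this is only a sketch of the setting, not the optimal scheme derived in the paper, and all numerical parameters are illustrative assumptions.

```python
# Scalar state over an AWGN channel: memoryless linear encoder + Kalman decoder.
import numpy as np

rng = np.random.default_rng(1)
a, q = 0.95, 0.1             # state dynamics: x_{k+1} = a*x_k + w_k, w_k ~ N(0, q)
P_tx, r = 1.0, 0.2           # per-sample transmit power and channel noise variance
sigma_x2 = q / (1 - a**2)    # stationary state variance
g = np.sqrt(P_tx / sigma_x2) # memoryless linear encoder gain (power normalisation)

x, xhat, P = 0.0, 0.0, sigma_x2
sq_err = []
for _ in range(2000):
    x = a * x + rng.normal(scale=np.sqrt(q))        # state evolution
    y = g * x + rng.normal(scale=np.sqrt(r))        # AWGN channel output

    # Receiver: scalar Kalman filter with measurement model y_k = g*x_k + v_k.
    xpred, Ppred = a * xhat, a * a * P + q          # time update
    K = Ppred * g / (g * g * Ppred + r)             # Kalman gain
    xhat = xpred + K * (y - g * xpred)              # measurement update
    P = (1 - K * g) * Ppred
    sq_err.append((x - xhat) ** 2)

print(f"empirical MSE: {np.mean(sq_err):.4f}   steady-state error variance: {P:.4f}")
```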

    A Graphical Model Formulation of Collaborative Filtering Neighbourhood Methods with Fast Maximum Entropy Training

    Item neighbourhood methods for collaborative filtering learn a weighted graph over the set of items, where each item is connected to those it is most similar to. The prediction of a user's rating on an item is then given by that user's ratings of neighbouring items, weighted by their similarity. This paper presents a new neighbourhood approach, which we call item fields, whereby an undirected graphical model is formed over the item graph. The resulting prediction rule is a simple generalization of the classical approaches that takes into account non-local information in the graph, allowing its best results to be obtained with drastically fewer edges than other neighbourhood approaches. A fast approximate maximum entropy training method based on the Bethe approximation is presented, which uses a simple gradient ascent procedure. When using precomputed sufficient statistics on the Movielens datasets, our method is faster than maximum likelihood approaches by two orders of magnitude. Comment: ICML201
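    For reference, the classical item-neighbourhood prediction rule that the item-field model generalises can be written in a few lines of Python. The toy rating matrix, cosine similarity, and neighbourhood size below are illustrative assumptions; the sketch shows only the baseline rule, not the item-fields model or its Bethe-approximation training.

```python
# Classical item-neighbourhood prediction (baseline the paper generalises).
import numpy as np

# Toy rating matrix: rows = users, columns = items, 0 = unrated.
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [1, 0, 4, 4],
              [0, 1, 5, 4]], dtype=float)

def cosine_item_similarity(R):
    """Cosine similarity between item (column) rating vectors."""
    norms = np.linalg.norm(R, axis=0) + 1e-12
    return (R.T @ R) / np.outer(norms, norms)

def predict(R, user, item, k=2):
    """Similarity-weighted average of the user's ratings on the k most
    similar items that the user has already rated."""
    sim = cosine_item_similarity(R)[item]
    rated = np.flatnonzero(R[user] > 0)
    rated = rated[rated != item]                     # never use the target item itself
    neigh = rated[np.argsort(sim[rated])[::-1][:k]]  # k nearest rated neighbours
    w = sim[neigh]
    return float(w @ R[user, neigh] / (np.abs(w).sum() + 1e-12))

print(predict(R, user=0, item=2))   # predicted rating of user 0 on item 2
```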