
    Efficient algorithms for conditional independence inference

    The topic of the paper is computer testing of (probabilistic) conditional independence (CI) implications by an algebraic method of structural imsets. The basic idea is to transform (sets of) CI statements into certain integral vectors and to verify by computer the corresponding algebraic relation between the vectors, called independence implication. We interpret the previous methods for computer testing of this implication from the point of view of polyhedral geometry. The main contribution of the paper, however, is a new method based on linear programming (LP). The new method overcomes the limitation of the former methods on the number of involved variables. We recall and describe the theoretical basis for all four methods involved in our computational experiments, whose aim was to compare the efficiency of the algorithms. The experiments show that the LP method is clearly the fastest. As an example of a possible application of such algorithms, we show that testing inclusion of Bayesian network structures, or whether a CI statement is encoded in an acyclic directed graph, can be done by the algebraic method.
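
    The LP idea above can be made concrete with a toy sketch. The following Python snippet is an illustration under our own assumptions, not the paper's implementation: it encodes elemental imsets over three variables as integral vectors and tests independence implication as LP feasibility, asking whether k·u − v lies in the cone spanned by elemental imsets for some multiplier k (the small fixed range for k is an illustrative simplification).

        # Toy sketch of the LP method for |N| = 3 (hypothetical code,
        # not from the paper). A CI statement a ⟂ b | C becomes an
        # elemental imset, an integral vector indexed by subsets of N.
        from itertools import combinations
        import numpy as np
        from scipy.optimize import linprog

        N = (0, 1, 2)
        SUBSETS = [frozenset(s) for r in range(len(N) + 1)
                   for s in combinations(N, r)]
        INDEX = {s: i for i, s in enumerate(SUBSETS)}

        def elemental_imset(a, b, C):
            """Integral vector encoding the CI statement a ⟂ b | C."""
            u = np.zeros(len(SUBSETS))
            C = frozenset(C)
            u[INDEX[C | {a, b}]] += 1
            u[INDEX[C]] += 1
            u[INDEX[C | {a}]] -= 1
            u[INDEX[C | {b}]] -= 1
            return u

        # Columns of E: all elemental imsets over N (C ranges over the
        # subsets of N \ {a, b}; for |N| = 3 that is the empty set and
        # the remaining singleton).
        E = np.stack([elemental_imset(a, b, C)
                      for a, b in combinations(N, 2)
                      for C in (set(), set(N) - {a, b})], axis=1)

        def implies(u, v, k_max=4):
            """LP feasibility: does k*u - v = E @ lam with lam >= 0 hold
            for some integer 1 <= k <= k_max (an assumed small bound)?"""
            for k in range(1, k_max + 1):
                res = linprog(np.zeros(E.shape[1]), A_eq=E,
                              b_eq=k * u - v, bounds=(0, None))
                if res.status == 0:
                    return True
            return False

        # Example: {0 ⟂ 1 | {2}} and {0 ⟂ 2} together imply {0 ⟂ 1}.
        u = elemental_imset(0, 1, {2}) + elemental_imset(0, 2, set())
        v = elemental_imset(0, 1, set())
        print(implies(u, v))  # True: here u - v is itself elemental

    Because the objective is zero, the LP is a pure feasibility check. Loosely, the elemental generators are available for any number of variables, whereas explicit descriptions of the relevant cone are known only for few variables, which is why an LP over the generators can lift the limitation on the number of variables.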

    FRMDN: Flow-based Recurrent Mixture Density Network

    Recurrent Mixture Density Networks (RMDNs) consist of two main parts: a Recurrent Neural Network (RNN) and a Gaussian Mixture Model (GMM), where an RNN (usually an LSTM) produces the parameters of a GMM at every time step. Existing RMDNs, however, face several difficulties, the most important of which is handling high-dimensional problems: estimating the covariance matrix in high dimensions is hard because of the correlations between dimensions and the positive-definiteness constraint. Consequently, existing methods usually use an RMDN with a diagonal covariance matrix for high-dimensional problems, assuming independence among dimensions. Hence, in this paper, inspired by a common approach in the GMM literature, we consider a tied configuration for each precision matrix (the inverse of the covariance matrix) in the RMDN, \(\Sigma_k^{-1} = U D_k U^\top\), to enrich the GMM rather than restricting it to a diagonal form. For simplicity, we assume \(U\) to be an identity matrix and \(D_k\) a specific diagonal matrix for the \(k\)-th component. At this point we still have only a diagonal matrix, which does not differ from existing diagonal RMDNs. However, flow-based neural networks are a new class of generative models able to transform a distribution into a simpler one, and vice versa, through a sequence of invertible functions. We therefore apply a diagonal GMM to the transformed observations: at every time step, the next observation, \(y_{t+1}\), is passed through a flow-based neural network to obtain a much simpler distribution. Experimental results on a reinforcement learning problem verify the superiority of the proposed method over the baseline in terms of Negative Log-Likelihood (NLL) for the RMDN and the cumulative reward for a controller with a smaller population size.
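
    To make the pieces above concrete, here is a compressed, hypothetical PyTorch sketch under our own assumptions (the module names, sizes, and the trivial stand-in flow are ours, not the authors'): an LSTM emits mixture weights, means, and diagonal log-precisions \(D_k\) per time step, the next observation is mapped through an invertible transform, and the NLL includes the change-of-variables log-determinant.

        # Hypothetical sketch of the FRMDN idea, not the authors' code.
        import math
        import torch
        import torch.nn as nn

        class FRMDNSketch(nn.Module):
            def __init__(self, obs_dim, hidden=64, n_comp=5):
                super().__init__()
                self.rnn = nn.LSTM(obs_dim, hidden, batch_first=True)
                # Per step: K weights, K*D means, K*D diagonal log-precisions.
                self.head = nn.Linear(hidden, n_comp * (1 + 2 * obs_dim))
                # Toy invertible map standing in for a real flow (e.g. RealNVP).
                self.log_scale = nn.Parameter(torch.zeros(obs_dim))
                self.n_comp, self.obs_dim = n_comp, obs_dim

            def flow(self, y):
                # z = y * exp(s) is invertible; log|det J| = sum(s).
                return y * self.log_scale.exp(), self.log_scale.sum()

            def nll(self, y):  # y: (batch, T, obs_dim)
                h, _ = self.rnn(y[:, :-1])       # condition on the past
                K, D = self.n_comp, self.obs_dim
                logits, mu, log_d = self.head(h).split([K, K * D, K * D], -1)
                mu = mu.reshape(*mu.shape[:-1], K, D)
                log_d = log_d.reshape(*log_d.shape[:-1], K, D)
                z, logdet = self.flow(y[:, 1:])  # transform next observations
                z = z.unsqueeze(-2)              # broadcast over K components
                # log N(z; mu_k, D_k^{-1}) with diagonal precision D_k
                quad = ((z - mu) ** 2 * log_d.exp()).sum(-1)
                logp = -0.5 * (quad - log_d.sum(-1) + D * math.log(2 * math.pi))
                mix = torch.logsumexp(torch.log_softmax(logits, -1) + logp, -1)
                return -(mix + logdet).mean()    # change-of-variables NLL

        model = FRMDNSketch(obs_dim=8)
        loss = model.nll(torch.randn(4, 20, 8))  # 4 sequences of length 20

    With log_scale fixed at zero the sketch reduces to a plain diagonal RMDN, matching the remark above that the tied form with \(U = I\) alone does not differ from a diagonal model; the flow term is what adds expressiveness.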