
    Predictive Uncertainty through Quantization

    High-risk domains require reliable confidence estimates from predictive models. Deep latent variable models provide these, but suffer from the rigid variational distributions used for tractable inference, which err on the side of overconfidence. We propose Stochastic Quantized Activation Distributions (SQUAD), which imposes a flexible yet tractable distribution over discretized latent variables. The proposed method is scalable, self-normalizing and sample efficient. We demonstrate that the model fully utilizes the flexible distribution, learns interesting non-linearities, and provides predictive uncertainty of competitive quality.
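    The core idea of a distribution over discretized latent variables can be illustrated with a categorical distribution over a fixed quantization grid. The grid, the per-unit logits, and this parameterization are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def quantized_activation(logits, grid, rng):
    """Sample activations from a categorical distribution over a
    quantization grid (a minimal sketch, not SQUAD's exact form).

    logits: (n_units, n_bins) unnormalized log-probabilities per unit.
    grid:   (n_bins,) the discrete activation values.
    """
    # Stable softmax over the quantization bins for each unit.
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    # Each unit independently picks one grid value.
    idx = np.array([rng.choice(len(grid), p=pi) for pi in p])
    return grid[idx]
```

    Because each unit's distribution over bins is an explicit categorical, it normalizes by construction and can represent multimodal activation distributions that a rigid Gaussian posterior cannot.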

    Graph Convolutional Matrix Completion

    We consider matrix completion for recommender systems from the point of view of link prediction on graphs. Interaction data such as movie ratings can be represented by a bipartite user-item graph with labeled edges denoting observed ratings. Building on recent progress in deep learning on graph-structured data, we propose a graph auto-encoder framework based on differentiable message passing on the bipartite interaction graph. Our model shows competitive performance on standard collaborative filtering benchmarks. In settings where complementary feature information or structured data such as a social network is available, our framework outperforms recent state-of-the-art methods. Comment: 9 pages, 3 figures, updated with additional experimental evaluation
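    The message-passing step on the bipartite graph can be sketched as follows: for each rating level, messages flow from items to users and from users to items through a rating-specific weight matrix, then the per-rating aggregates are combined. This is a simplified NumPy sketch of the encoder idea (sum accumulation, mean normalization, and tanh are assumptions; the paper's exact accumulation and normalization choices differ in detail):

```python
import numpy as np

def gcmc_encoder(R, X_u, X_v, W):
    """One round of rating-specific message passing on a bipartite graph.

    R:    (n_users, n_items) integer rating matrix, 0 = unobserved.
    X_u:  (n_users, f) initial user features.
    X_v:  (n_items, f) initial item features.
    W:    dict mapping each rating level to an (f, d) weight matrix.
    """
    ratings = [r for r in np.unique(R) if r != 0]
    d = W[ratings[0]].shape[1]
    h_u = np.zeros((X_u.shape[0], d))
    h_v = np.zeros((X_v.shape[0], d))
    for r in ratings:
        M = (R == r).astype(float)            # adjacency for rating level r
        deg_u = np.maximum(M.sum(1, keepdims=True), 1)
        deg_v = np.maximum(M.sum(0, keepdims=True).T, 1)
        h_u += (M @ (X_v @ W[r])) / deg_u     # items -> users
        h_v += (M.T @ (X_u @ W[r])) / deg_v   # users -> items
    return np.tanh(h_u), np.tanh(h_v)
```

    A decoder (e.g. a bilinear form per rating level over the user and item embeddings) then scores each candidate edge, turning matrix completion into link prediction.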

    Emerging Convolutions for Generative Normalizing Flows

    Generative flows are attractive because they admit exact likelihood optimization and efficient image synthesis. Recently, Kingma & Dhariwal (2018) demonstrated with Glow that generative flows are capable of generating high quality images. We generalize the 1 x 1 convolutions proposed in Glow to invertible d x d convolutions, which are more flexible since they operate on both channel and spatial axes. We propose two methods to produce invertible convolutions that have receptive fields identical to standard convolutions: Emerging convolutions are obtained by chaining specific autoregressive convolutions, and periodic convolutions are decoupled in the frequency domain. Our experiments show that the flexibility of d x d convolutions significantly improves the performance of generative flow models on galaxy images, CIFAR10 and ImageNet. Comment: Accepted at International Conference on Machine Learning (ICML) 2019
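    The starting point being generalized, Glow's invertible 1 x 1 convolution, is simple enough to sketch directly: every spatial position's channel vector is multiplied by the same c x c matrix, so the transform is invertible whenever that matrix is, and its log-determinant is cheap to compute. A minimal NumPy sketch (the d x d emerging and periodic constructions in the paper build on this but are more involved):

```python
import numpy as np

def conv1x1(x, W):
    """Glow-style invertible 1x1 convolution on x of shape (c, h, w).

    Each spatial position's channel vector is multiplied by the same
    c x c matrix W; the log-determinant of the whole transform is
    h * w * log|det W|, since W acts independently at every position.
    """
    c, h, w = x.shape
    y = np.einsum('ij,jhw->ihw', W, x)
    log_det = h * w * np.linalg.slogdet(W)[1]
    return y, log_det

def conv1x1_inverse(y, W):
    """Exact inverse: apply W^{-1} channel-wise."""
    return np.einsum('ij,jhw->ihw', np.linalg.inv(W), y)
```

    A d x d convolution with d > 1 mixes spatial neighbours as well, which is why the paper needs the autoregressive (emerging) or frequency-domain (periodic) constructions to keep inversion and the determinant tractable.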

    Competing interactions in semiconductor quantum dots

    We introduce an integrability-based method enabling the study of semiconductor quantum dot models incorporating both the full hyperfine interaction as well as a mean-field treatment of dipole-dipole interactions in the nuclear spin bath. By performing free induction decay and spin echo simulations we characterize the combined effect of both types of interactions on the decoherence of the electron spin, for external fields ranging from low to high values. We show that in spin echo simulations the hyperfine interaction is the dominant source of decoherence at short times for low fields, and competes with the dipole-dipole interactions at longer times. In contrast, at high fields the main source of decay is the dipole-dipole interactions. In the latter regime an asymmetry in the echo is observed. Furthermore, the non-decaying fraction previously observed for zero field free induction decay simulations in quantum dots with only hyperfine interactions is destroyed at longer times by the mean-field treatment of the dipolar interactions. Comment: 10 pages, 5 figures [v2: subsection and references added]

    Sylvester Normalizing Flows for Variational Inference

    Variational inference relies on flexible approximate posterior distributions. Normalizing flows provide a general recipe to construct flexible variational posteriors. We introduce Sylvester normalizing flows, which can be seen as a generalization of planar flows. Sylvester normalizing flows remove the well-known single-unit bottleneck from planar flows, making a single transformation much more flexible. We compare the performance of Sylvester normalizing flows against planar flows and inverse autoregressive flows and demonstrate that they compare favorably on several datasets. Comment: Published at UAI 2018, 12 pages, 3 figures, code at: https://github.com/riannevdberg/sylvester-flow
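    The relationship between the two flows can be made concrete. A planar flow applies a rank-one update z' = z + u h(w.z + b), whose Jacobian determinant is a scalar; a Sylvester flow replaces the vectors u and w with matrices A (d x m) and B (m x d), and Sylvester's determinant identity, det(I_d + A D B) = det(I_m + D B A), keeps the Jacobian determinant cheap via an m x m determinant. A minimal NumPy sketch with tanh as the nonlinearity (the paper additionally parameterizes A and B through orthogonal and triangular factors, omitted here):

```python
import numpy as np

def planar_flow(z, u, w, b):
    """Planar flow z' = z + u * tanh(w.z + b) with its log|det Jacobian|."""
    a = np.tanh(w @ z + b)
    z_new = z + u * a
    # Jacobian is I + u psi^T, a rank-one update: det = 1 + u.psi
    psi = (1 - a**2) * w          # tanh'(x) = 1 - tanh(x)^2
    log_det = np.log(np.abs(1 + u @ psi))
    return z_new, log_det

def sylvester_flow(z, A, B, b):
    """Sylvester flow z' = z + A tanh(B z + b), A: (d, m), B: (m, d).

    By Sylvester's determinant identity the d x d Jacobian determinant
    det(I_d + A D B), with D = diag(tanh'(B z + b)), equals the m x m
    determinant det(I_m + D B A), so the cost scales with m, not d.
    """
    h = np.tanh(B @ z + b)
    z_new = z + A @ h
    D = np.diag(1 - h**2)
    m = B.shape[0]
    log_det = np.linalg.slogdet(np.eye(m) + D @ B @ A)[1]
    return z_new, log_det
```

    With m = 1 and A = u, B = w^T this reduces exactly to the planar flow, which is the sense in which Sylvester flows generalize it and remove the single-unit bottleneck.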

    Probing pairing correlations in Sn isotopes using Richardson-Gaudin integrability

    Pairing correlations in the even-even A=102-130 Sn isotopes are discussed, based on the Richardson-Gaudin variables in an exact Woods-Saxon plus reduced BCS pairing framework. The integrability of the model sheds light on the pairing correlations, in particular on the previously reported sub-shell structure. Comment: Proceedings of the XX International School on Nuclear Physics, Neutron Physics and Applications, Varna, Bulgaria, 16-22 September, 201