
    Hierarchical Subquery Evaluation for Active Learning on a Graph

    To train good supervised and semi-supervised object classifiers, it is critical that we not waste the time of the human experts who are providing the training labels. Existing active learning strategies can have uneven performance, being efficient on some datasets but wasteful on others, or inconsistent just between runs on the same dataset. We propose perplexity based graph construction and a new hierarchical subquery evaluation algorithm to combat this variability, and to release the potential of Expected Error Reduction. Under some specific circumstances, Expected Error Reduction has been one of the strongest-performing informativeness criteria for active learning. Until now, it has also been prohibitively costly to compute for sizeable datasets. We demonstrate our highly practical algorithm, comparing it to other active learning measures on classification datasets that vary in sparsity, dimensionality, and size. Our algorithm is consistent over multiple runs and achieves high accuracy, while querying the human expert for labels at a frequency that matches their desired time budget.
    Comment: CVPR 201
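    The Expected Error Reduction criterion that this abstract builds on can be sketched in a few lines: for each candidate query, take the expectation over its unknown label of the model's error on the remaining unlabeled pool, and query the point that minimizes it. The sketch below is illustrative only; the distance-weighted vote stands in for the paper's graph-based model, and `predict`, `expected_error`, and `eer_query` are hypothetical names, not the authors' code.

    ```python
    import math

    def predict(x, labeled):
        # Hypothetical stand-in classifier: distance-weighted vote over
        # labeled points (the paper uses a graph-based model instead).
        if not labeled:
            return {0: 0.5, 1: 0.5}
        w = {0: 1e-9, 1: 1e-9}
        for xi, yi in labeled:
            w[yi] += math.exp(-abs(x - xi))
        s = w[0] + w[1]
        return {0: w[0] / s, 1: w[1] / s}

    def expected_error(labeled, pool):
        # Risk proxy: summed probability of misclassifying each pool point.
        return sum(1.0 - max(predict(x, labeled).values()) for x in pool)

    def eer_query(labeled, pool):
        # Expected Error Reduction: pick the candidate whose labeling,
        # in expectation over its predicted label, minimizes future error.
        best_x, best_risk = None, float("inf")
        for x in pool:
            p = predict(x, labeled)
            rest = [u for u in pool if u != x]
            risk = sum(p[y] * expected_error(labeled + [(x, y)], rest)
                       for y in (0, 1))
            if risk < best_risk:
                best_x, best_risk = x, risk
        return best_x

    labeled = [(0.0, 0), (5.0, 1)]
    pool = [0.5, 2.5, 4.5]
    query = eer_query(labeled, pool)  # the uncertain midpoint is chosen
    ```

    The inner loop is what makes naive EER expensive: every candidate requires a model update per possible label plus a full pass over the pool, which is the cost the paper's hierarchical subquery evaluation is designed to avoid.
    
    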

    A macro-realism inequality for opto-electro-mechanical systems

    We show how to apply the Leggett-Garg inequality to opto-electro-mechanical systems near their quantum ground state. We find that by using a dichotomic quantum non-demolition measurement (via, e.g., an additional circuit-QED measurement device) either on the cavity or on the nanomechanical system itself, the Leggett-Garg inequality is violated. We argue that only measurements on the mechanical system itself give a truly unambiguous violation of the Leggett-Garg inequality for the mechanical system. In this case, a violation of the Leggett-Garg inequality indicates physics beyond that of "macroscopic realism" is occurring in the mechanical system. Finally, we discuss the difficulties in using unbound non-dichotomic observables with the Leggett-Garg inequality.
    Comment: 9 pages, 2 figures. Added additional figure (2b), and associated content
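    For reference, the simplest Leggett-Garg inequality for a dichotomic observable $Q(t) = \pm 1$ measured at three times $t_1 < t_2 < t_3$ is the standard three-time form; the specific measurement protocol the paper applies to the opto-electro-mechanical setting is in the full text.

    ```latex
    K_3 = \langle Q(t_2)\,Q(t_1)\rangle + \langle Q(t_3)\,Q(t_2)\rangle
        - \langle Q(t_3)\,Q(t_1)\rangle \le 1
    ```

    Any macro-realist theory with non-invasive measurability satisfies this bound, while quantum dynamics can push $K_3$ up to $3/2$ for a two-level system, which is why a violation signals physics beyond macroscopic realism.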

    Likelihood-based Out-of-Distribution Detection with Denoising Diffusion Probabilistic Models

    Out-of-Distribution detection between dataset pairs has been extensively explored with generative models. We show that likelihood-based Out-of-Distribution detection can be extended to diffusion models by leveraging the fact that they, like other likelihood-based generative models, are dramatically affected by the input sample complexity. Currently, all Out-of-Distribution detection methods with Diffusion Models are reconstruction-based. We propose a new likelihood ratio for Out-of-Distribution detection with Deep Denoising Diffusion Models, which we call the Complexity Corrected Likelihood Ratio. Our likelihood ratio is constructed using Evidence Lower-Bound evaluations from an individual model at various noising levels. We present results that are comparable to state-of-the-art Out-of-Distribution detection methods with generative models.
    Comment: 9 pages (main paper), 3 pages (acknowledgements & references), 3 figures, 2 tables, 1 algorithm, work accepted for BMVC 202
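    The general likelihood-ratio pattern the abstract refers to can be sketched generically: score an input by how much better the trained model explains it than a complexity-correcting reference term, and flag inputs with low scores as Out-of-Distribution. This is only the generic decision rule; the paper's Complexity Corrected Likelihood Ratio combines ELBO evaluations at several noise levels, and the exact combination is not given in the abstract, so the function and toy numbers below are assumptions for illustration.

    ```python
    def likelihood_ratio_score(log_p_model, log_p_reference):
        # Generic likelihood-ratio statistic: large when the trained model
        # assigns the input much higher log-likelihood than the
        # complexity/reference term does.
        return log_p_model - log_p_reference

    def is_ood(score, threshold):
        # Inputs the model explains no better than the reference are flagged.
        return score < threshold

    # Toy, made-up log-likelihoods (nats) purely to show the decision rule.
    in_dist_score = likelihood_ratio_score(-100.0, -150.0)
    ood_score = likelihood_ratio_score(-120.0, -110.0)
    ```

    The correction matters because raw diffusion-model likelihoods track input complexity: simple images from a foreign dataset can receive higher likelihood than in-distribution samples, and dividing out a complexity term is what restores a usable ranking.
    
    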